[Binary content removed: `ustar` tar archive of `var/home/core/zuul-output/` containing `logs/kubelet.log.gz` (gzip-compressed kubelet log, ~0.8 MB). The compressed payload is not recoverable as text.]
.yǿゐ />#\_o`&Qeat;RqXtMS2`K.-2ezf-VQR ,E՞<#`{4‚SO:fs^zаNzϧ8lٻ6dWؑ0`\xl ٻX'/kS$MRzCġHI zpZ===UէU_ORDazpX`t!+|))=ӡ#x>WFqrʎReˊ#%6?wCm Ců4߯Q*EWk) x2ίWYu4:uک>UǕt)U]TmR\&ğt>>݀̆: AG6(?~yfH,K:د>EzŜһ]ܜEv+6ެTͶf6 X֣A?vDSb/^6h%)a iLPJj>WWi#?&Ju#x(#L#?% c-$+Ab̙5"q krV/.,3Hmk6EZ6#mYl}2>u >r@Q(G̤fy1eRLjI2Y&5BGJ3GD#S~dʏL)?2 >BP̟$W#Jjf+Us3[If+l%$dVJ2[IvYja+KZ~m$99a *ب&aGMrt>ϼ,]Aj~&'5/@&@ ,D`.(<6btN0w=މ܏S{-j O^B {UWcSE}V!z#%:CSned2:F*vK۹A].W#) EcRY( |iLU!ʓH&6M(L yaϵW+VXVt ZL^HF:qQ" >p |,-ha`j<8bt/(\xiTJxw gx,+RT*QB=&zL =qa'-ثnW"'R1EDä3ZZ1Xq}N2ƅ_̸߫!J{*rgxUã},9)5N' V "aKA)ČnN0'(88YH eڃeU6 麳ibYП%4_&U 8 9n:CS,̾O~tr9\Db2m`Z-W=RZͽQrk,!%B JaSgᙦ;Uf_pݿ+ o&̓D\o~u/vnE j߮&ew?^a5݉;1NM!ͷ4/,A}J3f1~咽=ɺkMwmxV]i#:nA|>U5`b'EO ;v,uZq4t*@&:^8_?׏o uo>~x _`&eQ!w;o ^Nݺnf˷JoW3{7D 5UZH{;9ʬv8! Y'*]0|S1D }aCK_a@).\i(>ZiMܯ$ޡ2fKp()Bi2c F߬r5m%AF=$UNjcc~dÿ7ƗEɹDz`Tky4€?2Q2 Hjrpѣ#%,JU~L:<Ӝ{&皨= sJL'sok1KҰrGJW.X^Pv|[o_Oݶ2F٧Ϋ:?ן~qreeUs:Wx}2)ыHx\y%ӑtUȐ dҡ`uJ K#[ؑ)W0ALS07XV&v"ZXxb),[\R3m4ȴtQHʼ2{\="m>]CB.\0lc*' da `:ʅ<"''{2!N:`$0R2-1=rZ+GKF80ʷ{7 :T2(xNMc}J;0)&8 nM[~Ձu7Ga^xE IRPFThʨ,pN2pWx2 qCޅ YSGS(TI 9Bo_f?P`^k nK{-XrugM(Hz->S`[R}/{T- }FGG=.Tf&["-}s2,2s@f9 3KzSjނyWnC̞Q 'k瑞B8$*8ʝ7\ 3pqV{t8# *j$> ݕgk&l^f?M ^ߪ's0bD6=bP0PT'goPoᥜ}b[@x!rF`Rb S*^f>,7GRU.BD!x"Q:+5!P`^rk0Gt Ch5V6s3"ا4KI S")d`ij(a643WPgݒ2dp/#a3|s/;L7&#.AqDiF0[4DC6@+pJ1IPpYy[Q)z{&Ӕjo౸jPB |7\QԾX4R~e-x0&&'V0yER Ӑ=L)kPП60G=xzL DQziLEeϋEASĨkJpCKpaj3jX2b=6pBkD[Ζ*h]MY TQ[nߔ7QWJ/(Q`\h8Kp)$X[0g1529fc>qlN q78?Oԭ5nJ5a#PQ%3\N@Kg z{ґ'(rD#z!Uxd!R&f֔1тFQ4`pHn-Gd}=i\Okן&Qw8ۭ~9 m\)v~lTK_sy 3g˲d9J[9{MZೂ,+iǀ穻~I{#J7x"!Y/C/K5sff̊y8\򬳆6++#+Z>,/.L "ƲMג6l5l?ȖV6l\i]i l-{X2RMu?B^yc~{C-0u*z͜6󐦩wuOmU2iYgoSVʸi<͖e.-;B: tC|U<5;Sޖ'-F_^w) Ax b94ZᔍBxP޾B3CaՋ@:.]J3::C 2ʐ*C 2ʐ*C] ZѽArzmJ]($T޵q$/0Yo@6 8atwUK%RKR8W=|I{ ؖ8Þzp&,^wM^y=kC]_6ͬ~3v &$LVHn"x32Ecҁ"~>" ق1][7҉yJsb'3-J97߼FX|]^H9D C8cBt%lca_Âvl}Fc$CRZGxgKIz3M[jC-U+;(vqkDZ4Qƕ0rkIb;k}z_:K/*I OOy$4LH2]䋟uFźqG<{)aē6`Ù1 ?t(Dn%mZ&P܆vDM`*/Ȱq5^عn[G#\|;h82r3߃`@Q[zupFW30s+g1:HzbUpSIǔ (6.hDL}ع;sJGiQ4սGv{~_ <1M밡l%op68&K;XTy ŧ΢穖D 
8Fl8ކr`CVhǃJ*FO&Jǭ,1,tf4f#s:pP.H`*E0ΜU8*dRR5c5rvk((g QyC(d] l%rX$LTC²6d$\;D0+r$?! @@!@BHYKA = _v2$cqƌUQ+}6KJ"Hbr aO+ѥD@\(+*T ۰!IZ'iRZ b;^eEt0IB0]?fr`,mABJfcR2Rb%Q?3\Ao@WNwqd|3A`F:HmNxrs;bKsyXc=7^}mxuu3*~THC҇dڱlthHh{ixu4:"v$m 4ZxR Mt5bY u"EA/-(]mw@[< lzl2O N۳F0\y 1QG>XɃw%$c)if&&ȅ` S2:T Ю ޒ}`=jyRm'mM#R.I2(mj"4_;\گVJZ@#r 15D";֣ RH_2+[DMi'd =|&]ߍK J7d8+$AAJ椬V%%Lg9 XevoɜqOǖ A?ܑ >dVMO0WHSe2R}:2ݘW~{#=Vubˠx`r/o?}_ӷ񷟿w?}}xw~[:`"pb"Kѣݏ:xteBoU~vSIC4]q]UVT3̹YvТ ;s0 `æ?fg4:T$_V "&ş6Ga|pxgrl4-$fsιx:9hufAʹY ztd V?j#k3w Y;)3n5 !bu=W XTFK3"ѰLFXT1A"ZȾ6$XZ &w) 2=I(gXB@ ] Iֱb7YeQD$Gs)-X&s V)A"s9 79>Gjf@{4v|Qco')xw '[^0ξNFhBB^ fy[w!MHW4Y{9 F/J,Yٰ@mSMAe }()|=-)$9[ XQm%D, <)G+O6׭^&Mf)[ |q8(XG FͭO> $1e 'ECÙ.5J0:G797P\9"%2ʅ@B %9%x-x4N򦬸ŵWRs} ^uG}jϕ9Li7l7>TQb3FjHw&I-0頜#O'zty Zo#y>8vH8HIg?"qӏq:*ҾӁ7UnTb:& -bR"ZtPc:{UgJ瀵$u1U<&2zP$9sJa-wVfV@f;ޕ$Be-)4@7f1ża)Mj-/odIHJJˆ%ʊȈO>YEkXf^-]?x7%"·PHPOkEqIUeEs\mD$Tht[Z')(N-^LZKZPMQbMˢ (5J3M(ps,8eFmk]ZH=ӨJh{}ó#b.]oeB@1Eb\ yA|μ^ A**vYŜIA% Lp^Kxc 85&,m}Ē:+`ZaEr۾Dj,pC{>V0A|W~waWq^_߾xp[DIO $LS!rW+pq9fbZ9x8"q5kHnbǀDQ(; _DΐQAYD1zOٺkѩqPv Bۓ r)a<˖02(l ѵ@7h+Q:ea *#YZx݌u 'qYV::,\m1NFgm4_J%‹j<؜0AKꝫˇ9 RTY VPH2Y|SK @ਲ਼+ *-gR5/Z(:no*g)'5T!lD~$Tu.5&3L>c"|P3͙BrQ1xΒ Ŗ=:q8FrPgݱFSs|W@_:5bU >7 RA439I̾X ʤMJ3=$Je+Ew޾~qȖu Ye .y:kmo$ͿCt+Y` U"'*Q cd9"֨F<LKi{D"hDV(<  2FoOIŇD璀W.x뜡BD)PmOҎrv,,6Bxx[G xm]0v{KuFր疮0!.#r"FqM1pjfL=»O=s$^y6I#uf y&ޛnɧ;̼0r3?v ͮnN0{]x=eh:,V<b^x 9{]k"6fwc^)G}e- [o%ٟog|Qi]eߖcuκKY6soNA Heȿn Y+pvڂ sy\\Pn4 NkpȰrhfƫMp5b`=,%u[= 1A.!ģ墁1rZy{$HmDIVxB{9PCPg-88pUT3*)%aT┡>5?3%au *PI֌ŖC7((g̿5}#h򗙀y&D>$&A) L"&NhbfS2,ZB 6S,;i}=k>x6ns4Se֮ڡŹ_ :(0 VnNRnV,!Ѷ* ;[4B/w8854͢# m^Q>z7aJдryw+k$V0'+u+ ru~djBqje//z0;ٔn[ԃLjiu8" Y:.t)´@MNI˭[M">#Wf67V):: !^y SŽ…3v?MEl͜󒦩wTkd8&&sVh&2f| W >1P2Jxe+ *a. Z73NDS ..p!;&)gңt(j8(NqFY"M i0ī(|D92K.RD9% H# ܁Y8QȤh!AŖcnO{B}^IWhZ^ ņVnSs,A;rzG@(G2@LWMT3nX& iRG/eB݌rv<aωNV! 
L .@X hˏ4Z*5C!^~^Qw%]?Y1!iC:6$jR@+F+ y %RbR}2MQrkCSʼndk5 +oe̥RI{S4g9O !q7^Zfћ3R3)#jis̃0?1Q??/\fWSB ~<#źq"wVi3 aPN59ЊsݞƨbTkH9+yDtrEr=/j [$HbOz?obN?~#um֊dJ%OjM|#d @@|4 o9ݣXx ڊ\nJ՜#3MGw&su\f|9"8[̕^rB[nh(BV~yͥωۋW;qx'!66 bAْGw^uǛNƻ2lkóX:ʛ q&#SߌPN3Lv<) *LO.w|oOoS/^e|xk:C Gpָxޭ~{í׼S0[^V~V9~x9;ۆ$Nȝ*GW@! _#].\ކ5Oj$*EŒ w!).l}. }}UU xh wk˖n҇҇V1[S %YI2<Po'p 3X! H"DڰC;m0;)}N]R:$2M@r 2YeIJd"B@ iª!3xA2P3079o碵jBi<3G'KV M"P'U'XP"k:ߣ=v+;/OwV:[mQ>̳23T\m׍5V!Re,@%`*@F)kkI1{LG=F;'լKy=<ORRȒޕ{ɼ7d2klO &jq=zgLP^*FbM?K-gG+ ('eN8\`oqaf? R,F0t2j[tt$t$j{=3l邈X$ ZZ፪%.gI)O0-$z $҇DАH Oђ@U?xи.TzXpDWX _; Zo[s$M2CB+/ LKn!σW~Wߘ/%hQ3X"PUV$-A[F.8< k-Q7VMQǭQ39I9KwV!őP!XmckPj[G0_N{R~o$R-}66ѧ':;b㲰|8-v/Ov>uaǂ[n{|y6ڕͶ AZ Ma'a6ү{TMe,:u}? 7Ś9k˭#:F / um֛\y %qyå U 4&՞<#,RFXp4`M`V`y@[g}Tlږ;|ze:o&$s0n9r GSqZ *kҁTu➂䮇˖?BKZ N (Qk!DXQ|Ќ6FYC,B* g ]H/$Q:ܑhRG{^\3gWvYws|JSq&Y (a%q0QgKG (R8y (a%q0Q8s ZRQ8Ca (a%q0QQPW (a%q0Q8D%Ḅq0Q8JG (a%q btgkߛ9Zp.NX=sP%:AvJÃ&B cbGK@oׄa 吶謜ʮ oTg+- BAE98wJ]+$T͂#"ᯠ/P9Pj$܋ۺ=F+#%XF(AR(0 6`|j)4d5Za"{&CD9I S"i\%0l\nU7ZL[Uq;\uDWʎ>6-X7,| )v(-oUИ(QD"Q$(8DQ¼Yw޾;l4[v_.L^=gŮh:[mo^IRԥN/+9*O0[BlUKmjLC8}h_~_K5MH8̾kLj*"W, q÷n'As3 ``5Ӛ*Y`0 Es;'8qO ^r6@Φ)AI@F2- F빧1@,*Nm. r$Fe]V\,AYApP tX1c@^ˈi؀ iȌuN66Ok j;hL~7L -MփcWP!t#6&iSgu&L5)tВEe&|FM"q$ ,+滍`^\6@kJcaw-פoI_ɇ-|Ef]_ -ԀLm p^nOݣVIu^:ǔw7.ܷyVwwfq;h釪fV\ATQY=0nj:AW3%࿳iCUo ~ɪ'>P/ըPo=4ӜbI]^" $`G6YͰ!e579\b[jﱯ#l:zc^AF7!} Ag'M> ,q@x77'pYQ> N>X`zSEmqtt:pEDAjNeRT& B84uQ)/tSԀ"jxT8!,$S{+%R.bgp5Nng128+-j/!:{"N4߁^xrRy4NGFȐ dҡ`5lђ*,TklMN)TdB}%3=)8A[ pgԤ|&-u vW(㩲Pe£?]:qB^x>p?] 
OdKlEEK7q땒k'%(a$2Zʈ*R42O@ 6N0+D{ACJGt`ZʊI;WQv%dv.x֙.R *rtn Wm(hu ('Ӂ}4&jMR V!!(B3uS/N @S4(@m"LR#1s#N+V8혜9L1M1]>+şuө4,E',k1qKp5X[3g152G+Qyh?NVqz8J孱5>߷l0} jÁݝgƃ0s43DJ[oNP *Us^b-(?}DRtW%߅3pfa˜ǘku:e*LL`*BAxZ U8sH2/R` bu4D{R6',z̝vX-"ro0CJf/h buoV.^OL1(_}:w4tS&hUA*.]1^U#-1%0:FbʿBkqr19~ ѯgѡ)Э%G?E~]W*5CЁ#N6%MnlF!5%z梒N6 /JHgl.EerRQ5l|aq>M&5J^7\ H/\x~7o:C=sVpRTk/Ybqcst5[9!NqQ\G&PPVЍjRHQ\HԱZ Tuj!j;5Fղ#[h⬶8+㓣l[/־"R|Φ%bJ*[fBҊh4ο ]r}jSJNC_qkNCljHT)jSz˻w}uUHgJ1Ԥj唋ޓ*iZo𑊱-];e-[4&%&@^EVj XLgR8b@r,qVɋ$GӆpoE5-QR9Z_,2bFgO$)=-8lW.ZXJ6Z)*NjTu5Q|t& bs)x cyUE!eZ]2)K*բ csJhKߺ%1k*kC>IlS*9()#:rh)x"4R-3+om9m@l}t\.O<.NmӁӥ{ ~=e?z)ϧg9iֳ"fLM45Xk@Q݆/ 1YgT,}v)tǔRo.Q"~qW"~upO_~nq ,\mo?kAT'G=ǽz89zP«1S/ãK9t}!jyxݎ/'8{l'lσly_칭{-8My7n -ő}2==:(kzxvi~pZ&:V)pCJIfؤL&B]w\ KZ%-;"0&]mB¦[L͖@c+meZw5+Y+Z,I74Ǵz竾mp]v[BPIrx:k2f܇ya*@ W8 Y?4xP-{ZS6/h^JOh *Jc&tW:X{bH5R4IPc.F=?~xᆇJ U{jNsfbeMj}&B" 7Z0;L TgS9*x!`"W1y Bd7#}Dp<)U%t ^R#TO/3q)݋wֳi+~s1Y'6LC j$ż[jg 1 VM/?5.謸!Nuag4|ΏxyǥG'x+~I]Ogߟ~xĴRiuvVOjzXk"( HmLgʅjC&N+["&/D&ފXUET3ٜ|Kd^Y mHv5=5r)kSE ^aOl8=%}LdkZ$IYO<9JrmOjXBs*usOͻ*r;_f,B/^Rjum>uu*rhV7feUyk|%4L.V/Yqf  {#Sn/Mv%ҺNaӵV-ўe%󓞭8ju;?!ƈOgsb@$ UO+*ʚR^8;b*S„SS4E7Ĺdo5z;gq9ąj.t$j"5MzHJɡ{vL^>#F Oy߻p(˲6םoWvhuhszկx;u!ƟkK}hW ;D[1w9`cqplo}| ^ZI&]̹NF3OjѪa,X^4GpEv٥OiYb8pӚf|s,?/<5_g\0ؐr;BL?}d'?>FC5Z.Ң]rVͅ]R՘u?{!4 Yϵ3t^xwVv!.QgD7WjhCn*IdkQruTlIEvzP*꘭U(6Æ}L\*jԼ*.~k<;Mfc1Hjd\p$A+ȍ)5  )h)T2h Q7m[ѱ(Jr>TZL-X\EGߟ]j9d\!fzbU&ZK}.8116c6c6N076eZI@t)SV5H)F|_[C$d>gZ YR:1vʒ&SD0\;x ! 
bKk,gw uAi⠅p~ V̺aQBtɻzv|s9d|Y#mig- Z%%Ev!%ſ>3Xͥ"nU/FΔc w5%9ꂧsprMtFZ]&l`rBc K>!(<ŽRw+(J*;; 5\jv#!Q֏H"Z(.ҧ_d0)OVF%Ba[MYq9-8z0kLVKsxœiJʁ˶ BPf K?8Ѱo`C(dh@9/MzZ(Z0*آy\8E ~La ~ jbeM2$## 2hB[ziS1,V츈V)@B]bYv\҅ՀBͩ ~0W c²4r 1c ;A؅fGܨi ( a2U9Kp`RP 3@ (9nkLVv/uR A2h*3R@qp0^] FYR[l NJM7%bUXZh5=*(px먋DK`jHzS KV#4"2ʚ18$@{0.B>84q9 γPn.,&*JzƬ 8Y`#\T&^`NZ'O8'y`˫MgAui֕ {-Oynr Cj1H(@}rIQE'EmX=;{^gyNTUZt򘈒h; _rPaQB{A!?@E&bZu|h4)Aa(~aeo48x߸kTNVxހ7#2h6HU]3h&t*!Gԯ˄<:Y*NI 6!He)H(I ݉~akx_}4MC%9(P"PMuj zYe5H)"Devw j(<`PJ$ԅ,)wP :HC IGB&DlOXIIjМQ9<&'5B̓Ӡ 'B2#PE/,M I ЀGXT4 3Wt%GuR=o!X[N Zs o?pi:D9+TR\*(BiW*cRK-F^aZZ7hR 2P'$P}!?g)(JJR1EE B+} BH.O6| N?L-)gԣ!B7lԣbX7A+D$Z.gȊR#JT9kc$T#Ko]5MGEBD-lCQTH-} u#6hzDg5(x Ln +!O-"6CLFx9ERX4ZhR4!+1{dTu=TNH&mH@D&Lqė5md~9EQ~ae҃ꬋ_lK%a`e|@te7+M}MH<|p %:(`bp( _NTS?-iwE@CPht6,Cn^tIޜbJT+6W2\8p!]dԔ7G(N/~iUkB]U)H}dhjSޜ}&CwGrrYoRnm讫y/}K׵[la pk۸OƶGIR(;Ǣ҈ "dW$OpEWR ,WD$/pŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpE9.\Uh>;&'iYHkUaDT5+6J$n_]ԧiGv z<|is=mk\ 9ZpN1 .B e)JCOM Y*:i|9hl;8öKK|L bC;/fחq_0~2q-H80&5잡(|E;&fwJk S4K tF}g/ 2%y[' ]by1A{>gwawMqon?x:`;w ]x;Z7wv:?J< ]N{~TCCߚ }zխ4^=g/`=XE͋/j]g|w/Ϻn4p{>Ƚm;@}7+7n\d(X!CҜ  G}lJ&CPDjd(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(Ld(L+Y> =mʓ\AZ#p)7Z33\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1W bpŀ+\1IWZ6 'Bԫ\[mp;\Fsc閞ILCXo[V?vkYTwvl]ϼarg]R W㸎t@B!+bbȠ6GmGz;m[V^Awc !7|L|a8%\JY{+Uc=7wqG >ұ]a&=~<sev[&7isou_"͗eyӏxi}7o$??kn}]8%wwWi"קH} 1]_N}K6?_ F\~wڞlW}-߾8Q^o\Y]*׸,KLd0lBEn&fa ܔtVA=􌃉3Rf`_ذ/|R_mq5ArHˋt1q<=z_P!·cZ2&]7A$瑪Q7h:QO ƜMA=0"KL.֥6n*66^ilMLG Bc؃c$nG㵇Ҏzm;׶9ݥQZHRPj]UUTA`eFi'2'\|V*jk# Yi7V N #|~|̊Ƥƹ`mQ?}fqL##vG=jcW{6Nz---- b}fX3ڻbq[u {wڛ>vՒקXѿ%Tb~'zW}?)6^l8mqv~&jDYHuԿ.W*3g/[?f#kDt8}K񢵭wn`Pqn8~4?VzӤI> !<NPa>c 0bu_?/Q0씶wez멐l2!@ l w ,tk~<,dd`4Z!cW;^j}&xͿ2X&&ʚ-Ǯ^>aQ|bYb˷SJ;J Q!uy i?Bȁ>lP̧]Z;&/wg =6 lpw-.e`qmȇMW=oO'˝Jz?zͭX>Q{јF~ԾuŊ{E6fgo耽6.dU])$,SOٛ;e{agw%*)DzD " I"+ɗ:jH)'Lxu4td$v0q9G{dH&'l۽\%cs{b=ݬeȟZԩ_>N{yֵ]zu8NZo'0mz+|U&_]PػVAj=faujvQ8o{o!TMp&VFW`RP?BrS)-N2''q: >m*{|R |UVm?vu 
.*Ea22hWxM!5Dӿ:s-v~|OZ;#w,wEIOv>`f>ƶzQiK>eR*'/^':*|5rLԸFve!f 18z~Dz~F@CЊ#䶜 .Υ^vdtiЫ=ֈ w=.3UUMunlM۪MډZX2,25l΢!R}6ܽ3vZQ*-IZfO^z'{\Q+xR(cADz%5 HSVgQ$t!YTvMaŨYAUDj1Ftp* Ɇ4!=`ZO@<~{ύ)_ !6]_r&٫5۴M?kU)Y')w ޅwtdE-/i f>?_rY]i2/}O<6D))&iH1(몬DY,_% Jn>O|'x"ۢ29)' ePdQ=Մ1iUӘ2o]7Ei6ab#\y.].]HY~W7^뉾fۍ 1݅GjX%f?Wg!i&j5DnSwhy;rߗWk/|}8~j39/uu#̅u_&z~xtԺK=]m GOqZȒѷY_yD[/h[hPSTC'~Z6,/ 77|o^j_^y׈:\\?7\օKօ9]:ܳFM|znFT1yTd WG hEDrB:!'iEI3"2%[4!TC* k&^YXТ0lvolo9: ʜ']sZ`> FI-bͿ Mu;c.ƥAv@2T}y֒x̿ޖ+zSқ} F5 *zWFE%$R10ZеcmeLk$KbOQ(> x#X #ySX&X:F)flQ S1R35J3rmxw]v]8Cϖݤgأ#׻GGW|F]0"\6DəN!NuOQoDn_A"s.61*ܭ0AJ !D>% .HðK9b Io@,L#b,lo-L͐:&H^V>Q=PPQ@(h&CA'ꆫOno:{uUJľn# {tN6ԮCiQ?8lO^Oװc-y;NŸ%BW\ @GmyQ,@hvRp I:xHZfg'<_,P%ԡY?.Ͼa]~bڲxc }U/Qe|9논 ~{TV ȣ$H5OG`sJ b $IV SYBjpB f/>Ȑ5Āʒ F%R&ZXJ7g/Y@nneU2կgC)&15~E;L ̢C ';w2*קci<@XR>a"w₍]ic1h-n"4WM0LY?_OC(c:); g(΢ û_ΰ3ivCPnxFT )R'KV1Ħ.jr4PJ9c{; =c0JAY +lpDI&!)=`)9ht(>5]!4^;OeLR3b 6t@F%4!'%[ooWt,ʪDcU %RSנ]lL*D&ZE>Zf0akG7r<R[,J]J.h W'":FbJ̛ ԧ5hY㐢b-&E\b_@P) M}Jj:VcČQD%WRV!Z $$tRy<TRJScԱ*H@HYvXaG7OHUkϴHw^A0>>;9 $T68$[iDcrwїrݏ?ŀfݡA#BgPޤJ ې$UTSZ:s(:+R"/)K#Rj|nc1%p$SMtT^_2VքIQt$ *dccgxwNBï_3ޱݒp2?|H}aSb"zkڐaR+Q]VE|)"MMwɱi&YL˗ yJ1W;G(́42ӿk x0bȓ-tFthRރ;TkGډ c,e: fJ&R(r+&LĨJAMc1gkعH*HM)"/TԒwEd`צ d)6gKm~] ^UGW0uތ <7?^~8_wU؍{Vl8f7z o䞴P1tJg׶rh9O?P[׻~O,bսytǣ'_Gwo9m ~pEgsrCnZ}vo>o #ׯL`WW<<6k=j4opow5 OB~i>t[CØ}>u aas}^=Nh}]#h}ZUp*٧UZ=J>8٧^UbC.00 KIcV"_-*s$ƔЫaIY.3>1^ͽ8{v٢O sFV8م ]m6 XOnNa낥+fey]-{M>p:j+3<s9e}6}0C.y]᥈m01Y<#p^ oIZVc65nd-bK7NiAZ2g4g vw滧7nTeE{Y'*IJ:1)iEմ;xCGn7g} kPEsh%l\kmj-R!FѲN-1a<X;KrP ?]oa7OywyRBE=D߇?Ɨ-~dK ;v(uI`yE_VcYgnpsqQ7׉nQ͟'j&V(Nv7LgCgcri$E^E:Y #(@.Z" _^E*oU,1 Po:ik|WBgnqKo?_3<>-nu'ҡ?*QalY7zS7^A*|u|-n.Vˬ#6SV2>`?x^C/'%+fS'u++Ş!0ijxcZݽZl;u+$ԋְ6 8(^Fm Wuz6-Œ7rd垄{K:n#vG7MIץ hNk&fīr"̶# G7י-3gw,,sb "f<&m; tU8߷"k-TZج-I^jLo<}ݠQ;m8izyEVRTMN[EF%]Fb1(L#AF!5y@JBuiUW`,c)ol6N-l:z[u ~㉧P8^e*?!rx2t,TYZÏW=?"]w֛Aj4~>Jb|M3k=E:(VvLEA;* G^GW䏨? EK&JB,c? ;Q)Ɇu1  ]L65} 'v2 ,)I}-6)xJ$mfS.}fl<9d۹<㩋S([wdoOo}>6iEӻ-k%^U? 
k_>WM=֢B2`u煈D:‚x+[[ Zw 7{|;tJsiBju]Nv>VY; Uh[ZO2BVY2 6CTl2d(N౞qlVh[Xڠ2ɐTJ$ '4@jmCx;%eI@cL-Kone^*\ŎV͗vMĚwm_e7GͦwhZIn,9[3zYHcil6ьCIE$sk AeTA$&a ʓH&1qwǒǟpWX^Fֲ`&U`eG!(X"Uaen(DHp`j<8bt/(BxiTJxw gx, Rgm,"y<"VCghϿXqa  !}B$\SL;"ga~+VC cY>0/ /_=q_ϋ[.'S%y"OWQ>uVɴ5Nis 0E{gRP8\uw<%x }]LVÀP +]ǥCRb)՜\ sf4qQO;3ʝsiJE2%ٟfWjWӏ148J1m[uCSJ/W`+Ӆ+UwOcݩ^OیrՓ_&,;gD1{ojn^FUR䃳7f0~ڞ_E]7^?efy`X(4ɒD}.zMW6:d]5c:N'a&#cȥƠb'툟. #[z,5ȆFǪޠzqs`>76~}7?9}k:? LЭ%AM"@37j^Jϻ~~꼻z]}eG$dʫÙ(w+@9E`wfwNhkB`fB۾ 6b %!~y—*D#P/6%)8s#&VYْ;U c`l4)ic~Zw0cd *0w~ .ݬy([;s=>DJhhx$FeQdJ^i¢?idV-peVk+)V3 q6DHiNVH !JCV5رj9i{w6lE}6je)j6[˷`zlQfd+[u &,*me,(U0xa2E:~eI1UkR'+KG{e=SCAp A'2mJ8՘'FScHD)b" 1Y@SJxKRUe?5rƯ(cʑoLt>QN/>0~zf`d:~]Rk[lWfo3;:ww"tbﻑKaFF UMPMrKH0׻$k7.L8?-Dp^rÜQ .$ ,u{Ep"8|9, y0("q#xK#7HG {irKۍ(%Rn&x9sԑaYE@Z)"V"EnP5rGpZ"8[W;+==mh79R;LT\wV` V>Ae6ĖHf2E. +4,J1y#މ͏N߈a8"%#-5KwȈN'ix58'zv+ɹ] Ӛ.xCu=.ź+|a&ѷꑉ|ޖנ!yk9it3"8*$^3n]j QA`x] q#VYE 62( F刊@tJwiY[#gIKf?{| *79L>^SAnQL`ogߤR%immnjkX#&mմۛg=XDSzPg`{n zvs'E0H)򖘈GyBL9  pF&DQ{pJ[2UBh*e;Q!(Ra:-RLp!<ඳ "g{6 r#= N:M4"A)2;ɸ Lj{GWGˬn YRGR\6OIg^`cwh+ V3@0(kLLj9+j$>g)r@H=DµKĮ~wd.곎-9e:mˉ)JrdzȔK[M5&[["8Fh/9vւN(򅖜N)bxЄ^R8M_q.'ױb2ǥz&}nWO| =Up׮IvQ6_F-؍L`\$ϧ\[uaJ)ILy]:8yi`Zg2Х;Z4QkՁ+\k*}å%%CT^ՙf]kU Ŷ$Z_O{-i -ēwnJ 7o^٤6#dJm\I0⺥ȵ\ƌDn\gM+k!~3=&M%!\5IBU¯oS }FKשX4 jt*b"ՓQ6)a iB,PJXod>_&t_(G^ȳpP< yQK+=  |5D"(O%"FY#Ř3kEH4tjw bj x񧷋t[ى,7 xrkf7Mﭖ/oo|zӠ77C,oot3~bMx"fh[&\ v,z5gßI&6;^~ 62P ԩvHrc$E(U",ģIH.PycOHTbS%aJI~[Wa?[cnpFvVu1qm{ow%96WfA/GK:"Ӈ ?Bŭ)ڲۥzrO|>wrޠ7is x)-KZrNKgxmLdh\su9!.&Ak+L^D $7VHMz&h 2-^DMmiҧ)F`u>YJbo_U^3}[RHޑ;t9*>0ϊ ܫVRpE[ Z9|2\.%tTEF6#Xi`H䂠'EzΜ 7җB>HFjlHvr,b!TBaAp|{.nx76}g{GlJBϢ`v&HU), CCA$ YY"F e(Gd`S Fm&sB!-+hvZG0qy6];vEmW]ڝ2gEYRJ u9:xF/=#}qFe JAx!a " 4CȊL,0hD )Q5ʒ9pj?&``<D;"fuq~;DlxR(U"% BN&,(jI s'ɻ)8c*Vl,hg*2L#YBY-W] o9:qɮ+"pų㢏. 
1.`ŽWEyu'F P`!)~MnkhEi̢ >F7ڜJm +2=ta^CsB.vr&p lЊQbaą&a},q)ݷܫ8'^M4q/u,[eG'uh5!&(ɰqV&{ȇ}Kde0ʺ)ehwqsSU}(cHi@]EC#,)R,Bn*΢q,'5!/ J\ \j~z*TG+Mޔ.1 @{?Sߕۏϡy~&_dZ_j&\W1҅W0}DN|ب#3Y,Ϙ'<2- Y7I{&sBē*˼Q,yj%ӣ8xn \&/#}Ԡvc 'VywRz`9FG.1od^#!Yb',Aҡ1&E8 ɲcƓ=ct$6~bAd0Ӝ9!HDc4jWΎY$ `uԂ'5'G~d{-Uf󋰳{9^$6^̔}h -Whx.>0!MZHK Ҷ2jMZk骑A3qi}?NZ'I$ _iHav*Ғ xy}CtHqoI$r1HE&vaI%mVIk^A#wRWT- 30hWdp!Je)EELQD&$ ,'Ex&%k7YFΆl_ $CNHv4eP2甭J@غV[+\ڷ_o±[~CWip5qe|Ӵ0O"OQ>ro{g`0hRhrWvdbOs.Lv-.v22CymZԭ76YhfiZҴ4p])fb1v 0Tu-YbAtβ(A )7 54`%uxv Μ MTGYrp]+t\gPC9Kε ^ƀt,gRk.>Ju$+nF)]&yb!X')5pjWFz)!tTzf/iq+?nJ\ ,UQ-Fv[?YV>:z/7T1{`]rB7Iʩ.ޟZgJJɨ53(Qȓ &ITTntعRR_swEmoj޾'yц]~/-dsΗmsߩzT=y8UO(΃zKw4 hT Rp&,JNc-vr;4(0A[!e)cY1  ~2QJĭ%/y.Sp9P`9.d9BM|Q Hބb&yH5ʬ*OT1#D3X՜) Zq0W*qA%:}It06SٜFΆ3l>'v׾fd7ӄo 5;~|j- n4<6iy{Svͮ|7h]\Iڸpf 7m{"\Lp4lrkn4fM^XȆF6sn՝X #Wh8\qǫϫ ϓY^='{AuSYȵb?4cz?O.2Fd^;GY\:W\kkoR3;}]|jow%N9D6nf/G:^wk!#tmehY{Y^'I>wrޠ7iGG <MB`Ė% yICH9YĊAg6&bt2f4#0)f`nH(IgZ ǤɍRKMe+;^DM9mFK'nV|+(|TJXB!f9o{o}%g#.+0L{bF*U6mVt ״KI#4Uр͈9V2 T\4@Oҙ6F3TmdF5Wj)'j+Be, &_%svqCo8Nş h|?|ƓA(Z =: E| U 4wY H3xgmfů!zMil4*V8=. r̢Jjp ٠v-]evjwʜdJ(&6sٻ6W،a@7ȋd O1E*ʶ6% ESCGik9]S]Tu<:͜VJK. C;C <1EJ: XLVq@Cdhю r#(`ܧr*Hc:s<,&# }ǖn%b}^"Mx GE/U<3(A dh$;X"x0\0BB2Ѝg($^2:łQҠM0I̥EK o0-}uv%Eܱ\Ľ\^ǃ>лA>0?u^^{=I-8ܽL^pΑ"g4ks,FXƱN>}YZN $IHʚd{vtH"BEEGVaqF2@EZ+i,pp&bjK*Ljd<)V kʈhA #(RD;;#gC7‰5.?\>E-nyݬoS?O~y?%="oO;n PU?C;Rj%M RF;+` :)C95,E^"D"$^ٸ `EH"%yv*iYP|T*"b/OQ!ݤK󶆹oVw{;TwDOZ2'=no(0M+y4TU"EzDBؘ&r2`3LO}>ZEO/y5㍌x+1,hqx"E ?e%՞*X}1KN$ϊng_{,U59zG#e(ÇR%ԏ>ihYE8hRJ!P` }dxtuZZ}p%6]c]<`ZA)OwLBA.:1ސtw}w Ov=*;:no>-︨ؤN\nWN`p$"_=_. ;xٟ81=_JfD9r\끱."48Jq$I;ŞGl9Y:8$~'Dq* ON(!3, JftX#C^2{#g7R/q'yo<ĖýU\B*]єRt>I5ZtLZ {'b:<,ߍ3IqeB*%9x ZǑ(Tyڻ8RG^t$ f1qZ@ >X`Y|Hxa!8$)f6hUl[EĖPƽuH ÌМ]g6ꊜM܏KO[7Pmyuza'UŴu6'ܗ4uu x~*Jk5/x(|>bBι`1X>(uYG̦Zr;Afxfc-Cl2K_0eH݈xP 1T)b+tÑ YsVk/;T->c rm6h*3շif3RP@0ʦ/mAPl0ɥnmz7}?Z]WOcϾv>o0Oa? 
?wnP (zQ-mNl='Hs`v& WIW n< OZҔ̐<===zW^El<f 㯩$܎\Jp~}v2[j(p;*a 뢤L Q`r)*rX"g*[G+0O50 %$ i~` Vhuh[5ɳiSC|煍 ?MuQe-;մfU^v1} dO߼*oIVrAߕsà 1!?#|2ry{oW/Vtg \'OW/\Ydu2BEn0/aሟv9-r͙A6'9Zֻv ɻD!3H\ T y0+l.x- B.h$4w ((7j<$u, x*-J))KE.s3$ZB@&GZuh?'t&C>Jʻwe=3v:hoV{ƣ?#tQ)E" \ +GiX\IczZ)s{p{pD˽Z0Q2p9l0U L nnRlŻ:D~SͷnRʵ;VJ[nbaSi̱ 8- )AIaYF&R!F*vKi$x.nxQڨ$ TxGRmƥLvhxLt1sp6{8[G%hC۱H粡Y#i`xҭ/ _1KxgK7UdM߫ FUs?^}25TХ5p3_UJ?d/<g¥tM5Ey,#U3oU"s-Z'Dt5!0 ̈́Pn~ .EKd2/Ƈf3_b; Y5Uqt={V1[S8 ʛ&a8J4`h`W'`, *0O1ӀV":O)N_}+u֋Dz`Tky4€>2Q2 Hjz[GIK}4>z{^sMTa9%jAocT`P\dx!^U'XEŏ~[(jFVѕN~`=^C)T,b< MBT]+\ql\'878 Er6<Ncʠo#)q+c=ĝ>CHTl`k#9: O=cF@X1b" j]4>Q-r7=UAX_σBfjϗ%Ur=mY6{wOP R$P"X=\ɉ\J$P G%*q =FQ1>WUItf0LևO_FQŔ4NU9ARk@R>߮=O{wN%}/ZO{#Y NHTcVJm6FAXT0dMĈ % $NGrkDyr72w7?b\F|O= }x_@/ O; R}9r ;ٻ6$U' #a@8xmgmv{!SbL r_̐CRTSQKivTin A$F D{yw>: f8= W= tyU9 o1s{"zw ;Pj (1q{)@|PoEipvn]U[1\x\Vr&ɸJ/}U`E$[s|["|V cgmIÐ"_jY"00{{ːDk7=mc㶞۽Bvҳ-uUp<+.yH Ӝ'`~BO lXf_S>]YCnZ~`THCKK)FD_)Q_oUx?tY`g8g% kY)'m,!d=jʝL<iX{))qJ7DvŜ(arWe#g`dU>n2=vZDϣ&el|abޥt%(-oUИ(Qľ"Q$(8DQ^xokl~!.wbg=h~ǧ1D XɴJXK 0 `},GQziLpQYU?`4 8HY*zg՘ hj^ˈi؀Syk+%"~55%ԋM']GbѠ$b֕js\Z>wη,;|//i8-+S}&ݰ]fWtݮw=>}څק6KI7iϬ<3IZu Ťuzкezy0jE6+ ٬;%,[n]e{ƫ;ocyr~He]y|_#'gKI!sx[ l[c搏9%ywOcߧp8D [:pktAj;#2Qx- yJy&# Q7/ XI(J1,,VJ\擃lprFw\U߅b"db"JLb"s(P+[\qT^s'ӑ)&2$†9%t(X K K#[氂S -5YQ(֢T[CLE @Op)Y Р-3Ffd jRDr#c6rV#r-Q0gl*3 WΪᲹZuQAi' zhVQy^)v2 DB%9 9*(o YE#iHQk ItY$R:TS"g5b/;ڹcS֙Q[kKF2)ciRZraBj3s/X$N:+F.@☁ !f`E; c$> 'H P>r ɌYQ? H+?6DD"b_="vp !OHY%T<3$02jA cMA: 6QzHL8cuƣA94aK4s'Eqt'^\f~יK6Eq=..xKH>Rʕ2Csja 5JUBhFRZ%AM !J7=.nwqǦxH2! 
aW1s^{o?@L/99B1PpQQx⨣Fn+(( s  ?}*KG6JT: OQo)>~]j@lS g 7.:u٦&/AzE7 W#oS:.ats\%w7"euk}/6ѳ;Yj uN4T]4TWih\Nj9O(o޾^T\z?껇|6L`뼶kkZ S T;)*i Z|m1kX|X9?iAg>+ލܝ4bƓMǓ?Oo߾z󛗍Hul"~+&^YyFZps7$'t.eG%FRMA,[u^%2*ޛ/zBu͘GVu'*&pv}"3]9@< pfIh0Z|>hYE84a)A(j񄊀}dxta<_s7uHݎV-e~< UN0OLB@.+[}u4[~M[Q]+hn6e1Ak_| ocr16i'`h.Li?μW\PG9^T\d=X\ ?$1jI#%OI8!~EdgIYj#6heL{9䣂H#N@%ri)2DyeA`kd(RKfbǙx\Vaz~+Ax'ͫĒ:zVCkc6r@RE,#xKGBl>0P22Dq# ;0 =źJ/:\(J'*4x2O)d)t(HdaQiƝVr ʓH&V;A]f(`0 M͸z|a],+:hQjd7zSF*зNxD J$ >pٳu1r=?Ҵ iTbdR010QJ0 R)=RQ'ᑲ^ FUr+AY}<\uA!VZ ?烡oZ$KiG8Lu$v)5JP{`gNAG77ӜFEcq͘L_U*c mN!쟡c)(훪 vx_MOPo%P0 8+:mq0R1.*N.R0!wTڎ48c/L1 ;uEqfj ig fpFgg14(J1%컵Zo{4Zx9')h!"q+Զ.i>u;ϧ'Mb4S`fnۅ"`я G/Nh>0:dmX-#Q9㷏t5 w?"|Aa41 f0bZǣl1鲇sT:{ȮQ;U#b1JGa&#i`xRW,ҎŸ9cSEt*'t65 78;v旷^xUW0QG/|30 NԽIx/}xcзӦy=tme J?A쐮>H'!P i CF{MAU &fa6!Ta]lƸD|q>.,|h>;5(!6Sޑ+%xHW9{۽%uX`4q.u}%?+à z_&unOLuv<>;PS_Q-&+U^cϣ GD)eGR{o> Zڥ;Î{q#0Z4M` !# ,BwX6;lѱ wXߝE=tEߊKߪ-po9Тs=+5oKwts]e6EM|2}/޽)ȰImQȃklNdy6}oW0,SCgk9~fv%Ә_JeJ]2xi2eaI1ˌv][o#r+>%m Cy9@pP,w4\%mʒlb13^u]&_UW}?^V_q zv'5 *&"ZMFQ%cLQrpT:aTbtieg'@bA qC*w,>D@KAFuـC'&%+kL=*]l5<s}.Zzn]n=M0:ܠ;F߅H9?@OGt7ofQl0K? }k<5Q98Ͽc)yG~^η?n o]\e3p'_0P09@<=X ]%"_kAAPнT2$Zd+ JC*u1(B%M92{ aj 1 7AsŘXP8  Q5TT(gХux3sC^ ].ʁi1l ~#xoGi`%﫹jCWk{ZiaY=gSsۀ y?!|oc /= !gC nY'EliЂ,@hd$qQZ`To@,6PELm \wUڕ\3ߵEwJ~:k o>2UWit6t1xtrJMh3iLLLbM{2JFRnFA}`s҆9B,c|(Sʐȋ+9WBUYsps` &Ь.[]SLs`+⊞M" !qms0Q 2ܴ(;|޻q mT&͞S wB(@5a2a9)N>BRa#x edt-%$Yv `bkv!B()ZB!ogQɛ*.[V}3])*),hK} dcRAC C"`׎n1+eV eKNZHm`$+u)8Ȓo%BthU WH)0/_P\{#,eH\UU)_#s^yWKP9a1I:4hM+@ッP$H*p1`ɕUh(ThATWDʦh]OfZ{vɠQ:P{ZWtƽ3$_ SP-2:ZpAk`Pe$Rט3` YMғu2No=G']@^>S׽Oe.ǝ#[Ğs}p?ͷlsӗQFuHGNϫ[7R|8:mS/}/{gjh]<_ll3(X$貱H _w칭i==}v#e5ԁP `:\ww Tkjv Tvh9HM}cL_?q>9|>3`  W2 BXrх+ͬD u?D]{i {]? 
Z8_8Uw;mw|Jyy8|tS9ӿRV.eb|,$Oq B*GOKV$K8l4oyr,>O/x?ǣ|m2M6"-/hIޛo.mexG6Me-Nenht@<;CGx'}W/e x 2vW]+cwe쮌+:2v]Oݕ2vW]+cwe쮌ݕ2vW]+cweue쮌ݕ2vWﶯ6bW%x]jww.5ۥflR`{H2p蒫u.%Wj\풫(]ðkv-a5 a0]ðkv îa#gG \]ð]]ðkv î5 a_)]ðu îa5 a]]\^IKص̵;WxF^IV*g:y ~kj 蒫]rKv.%W{`>0JmxNGeMg o"@!0a8IIeJoPYBDa(ʨH˾7O\mtX 3GI&LPB2d1$Ht)@C tIkZgYy0IҚY>CM,1x7mw;|z;z h_ƓoFiiA5&H{63dv+!wq[j(f;5ftj}C.1TR`lRkSKfX"(HIr {ជ~#4lbd^#TKwݑKPxZϐc! **3aqSfRuPLZrL2wXf*Aɼ `CrFRCQUV4g` 1x@px4H)9־x;("mt`E$K&T ^=ɽr?c(>{2!jB9):J'U:1ɒ"UN䜂([Mѕ"&,Hgi]:(RJrIZ^q80ZYS4iU$R{3s'^b:Kv2OAƍsa||'~D}i^1Q'%僵Z)V!kuZi %[D&E6H/&wkhULNjS)ugZo #8~!XY:Ǩ/Թ'\H'Shc)Y;LFc2I*."xjZ*VJ񑚆.gٚѳ0 TX%I,C:EeِMl߂x֮8eI^OnOڒ .x/*k˦׬|]s./׌w\i>)p5n[}zO~?TjH\՗99snN/g|0_v}tۓ_|lp;gUdn&sfd{5KX;`}.%_T*^X"<*{|1XRivi' ~s! R :, G"P˵ΥCk̠mSh+)Ee ]O&Oް ,dX/jV]36/tn{s"%&'?&4_|g' 8M޺ %LL;2\Pko1$(Vċx іkz)`:Fſݦ,D ɩ\NP 2*o ʜ;ӝsov\{нvG+@st'm Oj+LLX!4BQ|k78Bfd(h:֤ d80bR%k!6YAuOCu͏gzD#zGqɌǜ & hD"!3HVGcɔz0PDVUu)Rd mZBXgZbɁ*C<:=G F'@c̜ N3?~qr]Of~Q6W;+m*ZCk5hYSZ s; J2 v /x?Tz?܂ kX}T"^GkͫϿ2PLi }1q B袆1@Qjҋi>'3s4F''^h/yp}*Bd*^o)QPE $.8b8%a2f hU(P[ S -}L A"U Ae:KflXz?>5p>z}j NjV4]|:;fDt;M2ouNJ~ϯgι hHޓ.{c2GJFA("G"(rBk=V>j6ˁO+O67RRz~tY߁o,to-t 3x+?>j6v<6Zjn{{()Kս\Fn:( Of%m4T4T4T7hhVih.ux/wo޿7'ţw?~zg$-bi+o9Y;ғ,jN:IQQM 4goY'.ロ{q[O㢺rM$5xrR~w?o_7KK&%?wWv$< @CFXұXW%5CH2mD}l[A48;ĮxZmU$4ʷq@!uHKd}@[R O8#ã(dML-E"v ^"6͊֗WZvq200=}]-9=tڥ'ff1@;k>|^-޶db\5lN`hw =y'/\#ǗwŗؚE`+N@ΗĨ'Qr<`,7!ezEdX3$qbXy]<"r/=3x;4 xSJ)SO28ױo;쵤dQOKA8ѽ=9wpW(>>D yjGU" /q4d^)>ohXFY; ZDl ha[0 ^И;LT.rvqC27c,L\ϪuvO&#&S2ثW@L/^?xͨ&<N٤'Ǔ44C0)p3@W/aN(EKfMH"LK,&3Yr7KC{{vY]2 y0 cJT5F' !^)X&J9@AmGs+:UG)8X+MX+"A7&ȹB_]xgv@>xo6;`x6=y$#$$:nhMhsoxT1sIi `Ra( +i a3 Q9DkAh:7RKG۔]ZZ,!#Z0 kqYא`;$!J 6צ,ܮ./ c9q~GÙVJXjvRgDpTHnfܺ:[ Ҩ 0- G"h@QAsglrD`@:%;A ,P^ȹ}oY_:۳xd֞ߖ`LѤ<+z$KzB6k@0ج%r }*6k`f-Qdo m8WeWBargpjHLR#J)dZ`cJn}*;lНW{X>a$ՓYdǣ[S{#%kC1|'}Ȱ֚`C1X f9 6 -VI*yAE ~DgrE VLI8FJJZb=53f#gG}ݜH'%(V  ]: :'4ǻ:LӲRc=rV.6Rs,#xKGBFJ*uӲLƸJ#o[sI%^J7Uچ:FTxGRrP4:&0âҌ;$&%- ݦ X3T Р u'3+:hQjd- &P/xHF:qH( kN"U˞])M| F+#= 8HKR{N8#eBʻg  iug)G\Ҹ0vXZ _WoT-D ͥ9 
bU&vتqSKw/*~orS2{R"&Vi6GP I]_%E; KG#L9DGgRP;2U.&xql~+aYiC(\+Ӱ\,a@&. i"ca>:_b> uqiꍤ%ٓ_m}NCCUt~v*M5ZI Ίs8Bt)y&?+}jKMt+_'?^^M^-h" y0y3T/vIQ6*m䣳Wf4a6ƖEW3ݡefy чѸޣ`(|<^Z9LuNV ꬓ]vU#؈ż*ˆI s``E;G77T"[*5 F .ρ7^&|oWup|#0 N$z1?^9Uzt-e |ïgŏW_ARtY5].A++U3򅟽]4pm"r l@b:\lCDrG߯ .|(wr[ӭ2*x@V`")cqFi!ԕpf(1';P_Ӯo#2R N *+n'8wЈ\..Á>P#uvN:hDL!(# h;<+G\]\nB~JqtsZoͼb+tЮiGQ! 3=/WH۲8 ߋ'&p՛?G`_JįPrXURK:(^-z<̈́`@+__MQ k;,%/|AQ zݻM[sKނҥ0.]E*- xJ%, JFK3GXwVzNFFo% c-$+ʓQH1H3 y鳪gɪ;tC%~dKoU&>}Iv`Fֺ >zq*`_dMTZ**3vy> ox"H35k!rǣp[8s`e0/wz8BEVaa\Hi@%բֲb+S~g>~L5_;kH_'EԒ)E  $cOrIlws累9ò,{ҪMMw1-݃MKVPWCx|*.'Ae@ ~ Lk^9,:LC' 8p # <@D4PAI60Gih0*X=!XT&8R(wʑuY5P2(#x0k5f,`ZFL&Z iȬFΎn]|H'qGh:wOLvy]6?f;T8뫮i(zu&=Y{Ru{ ^rZi,և}3[Ic}Kt%۫N~T\LjPPu޵q$e/ s9㼛E1~TK)J!)kzH$>D5l_WU#'w9m͆=~6e-)nΧǫ[r=? 71e>mx%2 Gg9<;¨D^lF,S/ :~ \oŸC/jnS(d#lބl } Ԋևld*5^G6%8ao8g%|Ai6XͼMQ%eZ%kQXwXǍɾI KrE;\) <~QNw"u[S쏪Z| (  pMh+q="TIB dLTQ-uөl ǿ%Ep,"ьKn&&1S NMiԢaxLކ|d|RI|9(gyi*oΔ]I"vI% u"-t -ǚuLZ(*q[Z"2:Ř'UJJ2&4XϢՙp<Tid,F9΋*Űg슅0  ǗY2>-FGaZɟ4]~_;b&8cё }4FiQT3ilb63"bNY)Xe8j"#{GJ%Q3uBݎ B)+MRlFl> *fkPwڶ0j; vGaP;D.%2Fd VkR9AH 78KeT"1#L,I*\LH*PCFɐ ԢG1IP G(ds9Qsa/Bƶ b1%):Dq;"NB@T B9#$#^Vr.J-cJP(-P7)8Iiʣ,e#@9PfBs920"#g3"^j89ZYKvEZi. 
sVkJFɽt"%%9BHko4 dP(Ŷa1Yaq|v2+Es17:kfŝ%;sBv7Hs0A)}>[x?DžJZi}ŦzxerMF+iݢa2 Ige~o?K2_zӏc9{n 3NVۼ oK VsJaFdz'"wrR'\3h7լaqEoɍmXmjxTޗt| a|>jRg,zC:G0hKj1decUKR2d>SP!/р=Io4~>lMp_@j Y6b,^^|;\g~j䏫Pe լZFh]b:yUާHS2TDj i}L%~*VVz{d;B/fg5c=TB !Ë~6jIub7'Z:Ü˄}&~!6tD>^AlRvQeZc H9PXbs0M܃9R8xk"&DHp驎7~&{ߨ1:__|B5IP=fDSTcl+DAU-ߍ{SX'柃L]07~ޔ|c++fQ^5xaֲ'G,^ \ 4N^ %mtmvh8LJ죉TK/oO/$)8bvm2kϙ%L_`eW AaESߌT ]%WYۣ#Lf{s5t_2V 4wGn_͑JPGW#Xgy^TbWp%;3"{W`)2J \ej i;\e*-'9lr/p2WX- 2to*SpT +Il*lGDr9e0SZ f*9•(X\V׏S =js9)Q:jl`$7i2Ȼ17ʋat|{m0ϫ&ٕtpņȳϛq"W7ɜ.h Tr.( rp甇7>oo8B,bs@ɥZkrg^0u5Nw.B\5Ɠ#|.\$Y™=} Ѽ\+d 1BU+|f5 ob^^q[ ?EV4m]ML1<ͅ^O_^ncsbܿ{N.7)_ap,YIѝO'z [z@wemm,r#Hagcܳ0 dKRFU}֡.X9V+YH&"/DʃDGQ;ppu>У{_W=4r]X}C.gR76<}2N#_pʙHNQI::_׿˻~N*|o޽Gu#0Ω2/->K1xty%^Nzvbu'oU=/?Qd_Uf' n8VO5JtB<.obtӬb[ |@4wҷ|i}kKO7]D c2<)Eս"t+mg?lNuRN aiUs9=w@T14 J 2DLΫ^ꬿzgo|F]qDC(+KX"dJJ)B#Apw;2ׄd;&ɼ*<9~- '!Qbn VZxyaݟ$5{pHfNJFA:qE}y?nQ~S;6Jb L3 T}ͨfakFaZ2bG Q)QT)0^*0K .MΪbXBz%gn1qe;n8Amʫl\ybX_Qpfe_iwͱGBʪ##OH}2֯(q'wygj\5Ǩ $؀9kəN!5Juh{44ݾѰf H`#?LI1i"EQ/B=ÌK-fhB>dŘXwa_@D,rSQ1:uBE ?BDŽvhx#|n3jM] Wٞux xQPuH+s£p>Kpu'b\+.@q]SpG`;Wz+7Nsen%K{PpɜHDuEiI^/P)H)BT&kAcEKCedYT,[[B1Bl&Ά1>1/_ ݥs^] '9R,D]8>S{POY+A~#QF'jN椭3)Ͱ0i frH)Uz4^;DRX(Q&dR5l%z 9O@^G9FV%ДM(zX dc$kAsEN(g>cүY))X2i!΂% +u)82 o%|tA'^QƆDT<;pB}̇FؔE0cXBY{QET{@,*t%UP I7 2 )X51$CɕOh( T!(LY<%9־kwF MbgP<;PJ-bl<Mٔԃg<W-<:ZpNk;vl|!#tcLj~_&d[ ۅg˷˯uS)12H2m_5i&:>擋aY]2?@yo O ֿOσ%U[x=4J?4jٳ [QU=YN_\\=Yt7|5w6{l7ii03 v4s5I~Ms:4$0{O˔ h첄A&(b]u*xbx^ $1'mEaW[L2 (.E(hB=jϸWs ,ߓ}g7iZ7{}Ƣ-^]?x7nzz b!c{9,d,.|),d,wYɻ=wȳ! 
>ٔQO.*1/ӂeN$6v.d6;`VNzj*6c;,J]tĖ"V+[M c)Z)B`iB9_ _ XA7ꖻZ}i:P~ >z@Ua&0c,L1vE3-@U0+A{Ac9dz*}A$#E/^XɒTTȚ R{) ?>]=6RЬ;;AwQY)߲7CjaFI%Ue+-ș( Nѕ( D1K#(lv5>7uZIED.Aj;q`&L@U Cjj&z+U >- |xͪvCd([^?n|owH}fGF'M hm1{ZbVE|)"Lx,z{%~LNOM)`_P=Kg7Jhڻ/?^_KX0!*'[{agνS\TkGJq1 6fHos$CTkCNyŤbIT)SS|h6.FdpB^dwEB2+w.*6gCn~ փ׉| ་0uGݻA7}eW]K1b6.yëx'cήm,JF!O?PkpRoynqݴy85޴Y /ќt-;nfݮG=6^,z\=*KWWg>&s-jn〧_]dŷܚ;OBfjtS%m}Wo>k$S0clsψ恦&clӌ oip"~ $S$u|0eթ ٢(;5ٍ:61ٷeuK}MV@%]xf!Bdcȟ<v%dom/QGdϦD]f7[.TeI9L(x& jM7Cm0&J^Hַe+mW@)7F9TK^E^$NiAh(:'05^YPIYsip梀ؽ̓{QbQZT(jX$݆fOw@ M/к 5($tsT2lZknm-BTb*Z*V=9aB2ZWg)\&+5;3VR5c3q6 4ӅfƶB{9/ I^=:Jҫڠ kب@LPDBtr P+MIX5/;WDul\@jRojS[,D 'Wض[FYE$3[^cG[͎mo}{2S HVJI&<$蝱! E!`eh" l!32Vtl:VHVe2pdHlTgZn&z}81AU1Fl6?ԈFՈzk%3>\iFT$dduKNAZ lժZTJ59ibـX'59PH1!dƣSbKZIHFl&&g_ud8G.,V/zQz׋+*-Iq[jmW֐dM񢒓[FFEt d (zqzXaٱ>TN@5 ~*qU\xG]_{݋wgqB^UxUJ=wHHQO;7-em|֢#k섏]TC(uecӴj{iOf术ػ%C:яAtUȩh)Q_ 6BVY%cIXH^Zn6'M ւʁ))y ʖI]*Y" C(7vr4gØR闯Mɷ gjH a*(Dߴ=hFAkswO#Flڎݒ«Uwp'%c*/T 8kvYт4!+dPCUɑ,W$(H*eF9ֺhh55VA(:ɷ+A$y"(+s6RdN{Ȳ 8Cn;Ǔ÷o?' ٻ6dW~}vGE[I 8.k}xL ?3Ë(D#iыu`LwuwUuUuWY>̾.yS]O}_{ÆEб; _*Y|Ykq\4iJ۔:3B=Hs)+^i{k-Ux2{b E UcŤѴ05wwts⠴\;Dū,NFizv|w*GEA-zaMäoeq\E4֋%@*A~ƑB:]̓*.Ɠ0с ]OGKD- hcZ$24' @I}%%<|uٷ MsV㿎JE)g-5ތIO'n-H`|-} Xk!M7hw-ClE{AOW|[?'C&he Ma\`S%ZKZ@cӴ U aPeeb_) 0XLLvP^˼}G}o\G:cGmp/>0Iu,a*kF1xǯ'uI`o90kN5qoi7ΰ&"pMCU@|Ϣ^V39Ĩ'QrJֽMq|os.<mAx b94ZᔍBX$0CyR* Vv !c*eqc*h%{gqoR0#4'(xAcmNZLt5lCxf-jc>O{&UBЏmX} 6VuzՀ)+F^}{2$!8@ 3p@.8@#u8Db{8gl,1u}TtحUD${u irU"XQW\%wE] D%{u Օ`ZrC V쌺JJ+ w]]%* ޫg$"/KWYkVː0{+tE#9a⇜"-!rd}zlݑy,N 1)0:Ϙ>⋯77cjɎ@QYaގðI/ \Xh8ªoNbww΂C0߼&VY. 
&as,Nƃxq_^M]0EXd% =+T&7uȸ*ؒ翽{{Z(ڜ-rm|6#oU}m"k/9v3|+˝R 3|;D#%b4TEǸOqBnns1$ bҹV7TX)'O33\"]wݓKT*䞡'X5}uVxgUIΨ+w?\b+ûTC5-u}SȖUVuTR˽zJ4WT(fc3)[Ò)h ~kkJB;vNu,Jh[fJЮv5zr*BhAx?^hkMMR#õۂ`~TrvbhxL daΰۈͰ#xBUЏG;;i03,ۍ֚`C*:1X f9 6 -j#)6Z%h]`Aj#Q0!SP0jve67aakYɏf9w(11 DI尵WǓ ؅3ۓ#o̕glò\i3kr 1QJ$$jTi/=.:_u!wk&5,}`CTD2'# J380\!/2RԳ3{4W^`%OyMŁ[-rMCSJ/`8-$LNW92z=>?p݇Oӳ7ubRHA̹A[s(V|͙'wG {bΛ!-3GDRG YL;h8Mtޟ68{%^gla"^0f2:>?&ˋW@IlT!\cQEniTOV5V3;~Ǔ7o:N|}w~:y}u7'z w0`c8h$AmM‡|sC7iu\v]zY}7WR9/\Ba|1e*rB1C'YU+d&'D5!0 ݽ 6עf5%!7jyܗ*D !Z_Y4>HWӟ^VmdN3 &e8G<>'Mc/#gexPQ U0Uc~bt=PK۩KU,&+U^cϣ GD)eGR{>ztd RRdqy5QQ{X[AX"# +Gpz% PBu \5\~bAu{ض96=!7Cخks[GOd1:ꏧ} `^"f!?k$ȉR4gٔ@ro+l_f+H>I$$f! ] n0woirA{FBPNci|&.ʍ"6b*Fc@lVJH"E'v}ȹ$ZB]2f&Qnx] rFDV͛y"ӇsӖsnu7|4j3HJ$A3E.sˣ4,J1C/;WOtDy?#b QkZ42_x %'q/K&""k.5FxǂygsS0s05m('q}**q*=B/İs8!,Vbd[b"2O(xL9  pF{Q0l*mԌ ϋbiKO|?G_K( Z:ziiԲ &ya.55mMByu4Ҵ4nMӺM;k]$Gu]X{pi>Z&ĵ r B($@b`1@1b"iZ+It<Ԭ<פgUvoϓ?1rwmͫv0F]êW=HLs_ y1bJRzTʒtaP㞂lYfS{(YJ+=j#o% c-$+ʓQH1v y(GPB Hh) LnHN'V°n4ϫGb>9hpn:Zp.'sNƃJcgmI ȗZrV:Afx> Iwc wObN|lE.YJ;T*"TܚJQz^-Li3R!rFK*Y0R`D$2]wT:aSp/q$%%K(%2ZR vk0G>h)4dݴ’E*L[:G-oP9Ug4IصuI7l5UUӺN;|K{ Rj>::YUA bg%۟JMCw5YvuoBJhݼU{+oiJ7yX7m͛y6l]SqK|۞\WpRYgE7Xb4/ܧ:> ; oС:ls[͍oLF3&h Oro1Εc_2m\ %:z;t|&;'fs;&dBRm<F(2gc4 AJ Y&`#v&{n} d%;$d;[NTmqt1tj9,!9GwD0\睦.*λ -,0vGEgO>R+ DʩR ˽)yfgD G:,aryssu2Nnu>k+~'ڱvo6^e`ΊRq:2K\aÜL:N5RH9,הBIMEVdJF(qSbr :#32x5):DȘM dUaaq[,TPvXU,\8͙&xKRss rm;)P`uЯa/c ǭWJ: Ƃ(PD G&DRF7r]Qz]HƞN ؤB:`^e) ӁiZVDNܑYv4s&Πvq[֙Q[wi+2)0sǣi03ގӘ{"QL%xd ,d 1+Q0Ӧ r0 .`Tir'&z7—$lTCV,9|TN#tE.#&gwXo_#%/I{w_MZB+팄nRҹh0yQK&[ `u=ju`As=4~nBMph!]7Cۢ6 3{; ]pJUcCv:(6k%h*?hKvه-gaՀxuф^NLjQf ({9EXFcq6:(N'Ejӫmpo B/)%e}z#aKp%X[2g15c;'V-v9̜sovν}'DjG0pd^dEV8$X`Xv!u0f*Ljd<)V2""&ZH0<1̛'}b+a!vV/JO~Kgj* x:+:r+ry^Ee}'IyIF)GT[띕{NE2DYARD)a+r;;` EJ sT:cAyC u@JK(ח; ^/)>~]jqߧyf^caCŦ.,yyz Jsӄ4l1o.t].ohyҗ/zf7,) wf%U2TeP]Z&C(Cs,vѩd4Oͫ!h,[o~!' 
'M>[|Lt(8pFѴ@bVELM;XWx;*SwܜC&MGVo޼zfJ-t9oye 0ѥlTk !ޫi3"FS0Bx΅aI 6yn Q{d,Tg,͒9zI:nd^Ha72іn ]"@ Sc߻E$iT8J ‚H-RZ=x#:"7嬡{; L|:%> pwc)A(j_;%Qk$-h_B/ɩw#FF){-UǤ7R2Hx͛{@p1׸i{{YEbãAߝh [|_d&a\7}l Uת+ vikaL Kl/@WtؿU|;Z+:5x"\^atl"[nl'/T^m nY6|6ʊAyePo6yS5ä%LӁ90ʬsr"@q۝$M8Sosn;_iG0w=gR5$JΑ'&<"" Ŗ#I,nݖVj2;ey{GJ*hS6 e9˂ҁ؝4]<*0= KyǪ]iiZ.Jj{@,R*{~5oLw=tPxI$BQR(pi3c:;@^D6:穵a((T_,0Y|Hxa!8$H`y !ceqc*h%{gqoR0#4'(xAcnz\lUS.yݒCz2jsYĘڦ;hdh^}ߊѧ] JZ.*0ҺtDh & &q56AValIJ;6'&Hx X \%qސ]$\=Ab`@`U}$-cWIJ:zp%ST\%A]WI\%@Zv^JR|p%T\1Ul_ eB:\t0"\IDz:w>0BSV;kxtP;Z耔0??T?3CoVQ>`0LЄon{ lo`:+Nj0 R*Siف \%iݹw$\=AҜqEIC? ݛh\峢9|t3N볪tx`L{)cMwzq:jcѷs"i>sʍ N݉ZaTqku6q2Cȝ!qǡ)ǧR7I~m C~ʥx{xR{nsTI),a> bv\a ?Ԧ-EǸ.g#3!!H2`K.-ٻ綍k`K3;tx)1H8= ( $E.eʆ=)b8Ƿ=+o_۬snҝuMӭ4ߤ>+:^xKャyaQHx3jg>sWEp4NF[ hGm~~P\_n|~+ {:aq3( {|\혻GFQOdL#nfQX뵞y;'Xk=J {Iқ;L˅y|N0ˏx@;)aKiNU%ARkds{&M\ZY NI$aE9emXT0Γv`r 8M(rne$I1"1rb KZ s=w$lv.zWp2-<\;7[ ʱ{QqNFT1o9 TrdR+]?aCJ'>HUtL8:y3{(b5֖YL 8+}Sm^2P^mLeS)C> ]uHwx?~dolWÃu?(Z9v5{x&P7mfC~NۇF1(bǑUG79}9Pۃ݇Kۃ@([?[km8h*OJr)m2mo&֍/nH-ZKio60_%(0j>AoaR>/>nͶqP8>q6Ï1>> XFNkoV[M[аkB=Zg<`y w6m>c誔c؍OOǣz'Az?D'ÐT:ӳn6K&"oU}XӋF ϣohi@]Ԝ $!K+HVZzB^準%P%Iuߍ ҝzO< v{)q&dz񌯡/g3UZ'}]dku}'U bs[93 K[Lm[r.sMNr@p@$(/tJÃ&t0*Rp4>\S / P0vY4a#A2 ]%fbʌbi9hnh]0'C<6l^nه4Bx֢eOqJ)5&RF;+F-fs!k5(X(="Xnq =0|``¢he\;6YPl*"bIOOT>>__#m#3vdN]q ;^~| ;h%g[ŠKFSHpF=lt%]rV6Tw\m:vuUN;lg ihS`{oSܦmt@]oddUkuިN:•y2ev0!ôK uF-'YA//x?_RÏWW~8I8YmkKҜDpDѬ@x,9,pQ<~|9̬XgA >-^;RIxfɧFg?͏?)յ^܍ƞA+ N`˃W=0҂sgD3a 9Q)?<2-169xn Q{TDz'?utuIP*ro/ jQ2ii7nVT:痾weAE GCZ"x6R OГDGGQȚm$ r62уO*Mɋ*q'h8;>1>5qkA}8k~Ot04L{m 1M[h*}T*>m.wUN~=>O:!A}kQ 4qMzYPՕٶnU%\w ikaK.2wy|4w7!j4ԤP\|t;~8 ?6SvܱH/ͨJ3_eePƯ7y3I!QiT*\z6\:ټ5 P=:Ky4&RLz!0K%dEKk<-`(0,j=J "Q(1qZ ~b`GCpI%J d o ((sVA-m=3{9A s7%:N.!1ggsfz=[>3N NzѧPrT{AcѢDK&e͸UZ#13AK+*i6٭R#lU۩JGq殏޵uEȗ{y?ٴ{Sl[4/3F\In^~9G?4D8:GgxfH!`?ޣ'dcZmdmsc `0xƨt30c6IGSJ+' 9F-[`&Nhky@B$S2:*W&9[PL9k|n7?3*tf~4.]^x8_eׁ. 'سGhXi-*[>{9khQS,MU2G& ~zҭBa<^9$}l(AY'K5'2;t1FF"V'{AcP>NY*º#m+N&l=Qa@n,D0 `Iډ wP? 
.pښ$oJk#bF .CHI#p3H)L rmDIR}j ,A &9FFz,#?+,qnk v=!?/4-J=ys%X""'8 >gqRu1C<~`FO<r]icO/ݩT'uɓLr&;r:⎜b3.8:gG'H%bh4&%xv=LuaPY|Cr7Q[`rkݜtI҅6?9.\q[0 S't=*,̾oVxq~Ԕu27cֺ=Wsλk+1/h8-:` (s㝦ww&0# =T-Iڅ#)}Hgы_:#AL:fblGWcvϺe/gQ[>%]=#El棎ʉd }1ByňWs~;ɑ=VĒ_'';=?%v/?;o+?߽:9~ݻc.뿿>~:#p5 ?oEf ߯\0=k|\ᷫ*Ώ^ӷYl{$іUpEN}3)$tG>bᄘZB0P\)15_y=.m|l>.V8!63s {yhzICw|SmsBТ g'3chN񢃱f.htdb J%U⛅a0(nTE4MdJ ρ;rts0Yg&]V9xb'w>xtd VܧɬZ`ϋew\KWH#e_FM^qpUjĆW j_M6GQ{j%T[/]RVo*g"3t{mDlo9ܽ uK*<~nRD}V|8u[c| gޝ"F+8I]^[UؕĊ*o=+X(զ6_iHKKumaav@ea.ڨD1/az#W/y[46`d+TVH4*ݸQ& MɫTN\ʁYjFE UoVCc@u96H<<Jj R.32< yKMeDJsN'\ dgQmBYBڧ*ɎKpW Sc_O*;=80}.>jBi6YxdM Q4,hEuʆeڇaٰ J% ,1L { ӁkI(gO:t +&;YL,k uV QD$GsE k Lm$j䬶aok/WZS`櫓7HfZYS+ )٠S3ZfI3ʆN5v6C(oW~fϑ!=B3{xfJ1IvM [XQm&.:T—X((O , FQ&r0%ҙ|Y]LLK%LDJ`pK-BfX5ӗ1@s>|'q zڹj_vuq0on%!5b|:g6CM=1NNkzk@7(Ѫփ%YB0XW:B։BWP84[Uo9AϹ>R^Cp_y%M27JήQw5cR ~8IhB3LVnbd>$$ʁLA$- ,/D͇Dxi}TC,'DKSFt$xmC9 Ymiک E PB&!aRIjr`^J$O+EccA5A?(c$YLRp3fӜ-6@M[c'c)p@u A [kL9s L!m^WQxH*7}e,z0BCdX4ƨȄA'd:y'by22V .dp5?t>R8JrlS7i0WMʈNp|J^Ԉh5RبMCwnJ!dTzbYg>̽4&e-zz欎u}އ85^_N ~VGeH龲e~!a Fu6Mr黷EK?Y_V,)>Sn^Ӆٝp=Y{aݓ+n_̷ɺ彶ިB,GmYMVuNݕ-{t:E53J+̅l8 'Lם"WM ^b}3JxCM si穕M꺫rnURR3K$Ж[^>{K wKg6sf ~; )#ߟv+jivwK̜ގJ!8&h)3LeeND Y g`b<ԸZ>($i;-ڑ'[o:7tIOf] Җꆩzol͆/W~]tf 59o/wh|_oL Q'нϪ-ghGQs9I%n75o6^zT]~> iЋ55 g5#S#@`mqdHѧCx9bSrj6eD>H5Ôg,+$X:i3T F jE Fgᘋ{1As 򍓷˜x1RTNFΚᥥ*{Φgqw#π`09F<>hM.f^<߹M&hc KC]\Ŭ_37$$y-aP # RK7Qh-sP9O-y8lbma"'n 1E:qɦzW֋zq)hRn-MF'-*33dY%3Ap2E).jܱ>L@ͳr4Ƴv\ |O)M#N;$k?NqPuԨMFۤx|4Y~&7r m7<F;f?VKV>9)2cI4y9`JTV@r,R GۖXhZCnJ_@ə|qyYWnO(LgGZ`!}pdD5 oVLX`10FɤPxypP ,gI{8+I>6q,YASbL [=CRHڰd{jꐖEJ sT2g_Kʎ*zr)vWJj߻(_6ǑO< ddyf{i*nS>Xo07݌{q!+{_{뵑ݸZ*00`(`(a̐2HWGo$Ϭ([y36ٶ %e a3#Nt;:k:ff8˾fz8Auwì* px8\Cw?W|"6aTl À+ Ax+;#/_聑;#b1 #ɉPydZb$%?5XG;ѣgA픟b]oZhvS% %id뀺fֵygodGJ[L<pIJrTmWT$ AAXRJ Ǡ}DQGDRr;60$!uHKd}@sc)A(j;5 YsW v r$GL7ãkT|<;&v g[t8w_|8w#_%mX@.~o0*çG\)䊇Y5,=QɅO BfG(E9-gJ:w3?5h9 GW 33ʍ uܖ~erdmp7ǡ2"mQ tsØT \A_wd!cjs,O涁4RmC@yU#}dMᢜXލ3bB[bv%cEJ}|1#.WXUM,pK*1su}fdײb^?rpN+ _O<4%\~>l̓)绁yXLn`=3 g'17qgIoj4xXETE?y˷j)[+:UȒ;Py<{ 0<ɳe·1T}j 
0^P"!+I}bp'&ڀUy'Ą3xSsOFg85nڥQ}9PV,BXR/̤+'Ez2`HLMdOuB-p6]&A2uu2u4BtGkfـA0gor$kmzo6Y{v.|_q?TCOmb :8ɮWkY;:tk\s+mŨ斛D R.dmAPk^:; ͈1:&O}yDg[K-6хyg\qoT+쁍Kϙ$F P,0Y|Hxa!8$d2XMI T(sVA-m=3{9A NI8n|yoY`|~шjsY4b7}r@{b ˳_/%xVZh]`JuAɫ$ "\^m6PsE(}ȋdBι`1X>](YG@Zr;AfN]"n,"&XqZ,/ VJ Zc!ܰC!fH"8R$KyG <@Kʶ7PXtrn\͟B6Ԝ e/L]A6~"Lg>,[|\9x޶ka0˜`y(f^Vm[̋OX܂bMSlz椰jZ%h%:mrDGDܛH)K?~v J+$T΂#"As"Jwd3ʺ7~q/qD% kY)>qm8 vy twKg@Ӗ/LևL2s_KۄhyջbF[y-4t-n]k#:wތCY5A!LlTӚUSUhPT&^{UdTʪj@+7`i>'Y#QVgͷŭ35;s6D>f6D5uQ&I 6ex._}s W8] 9|8">*&)9ѠX0'mGܨ7-!2p`0Js _ +Q"J:#ҧ\EVsT$?2Mlˠ$qDYeubV3圥l<Ec-L_hCHHh `\a( +i w!zK Go88,ɜ*-cjfD cGDEwhW@W0F DLWۆnfah&|ʶyV$)Xf2}*0dV2o-R;mc[.rp*x ,¶[˺|e|(ث!bIbn܄Z,-RA(<52n0hN悍s BFRlJ9m8$\.%ȩ:4QFu,>wΆ oL0!Asuۄy3oSHMn߶lbWYrU!gKGBFJ*u22Hq# ;Gĭ9I`_MY+ igHP4:&0â]i%1p0@AP(@2>g+)=* PeBOYlZ)pZFֲ`'Up(52 CBQV.wQyvMI ȐLYRE /mJ J3pN ==XѲ'. +\ͼ0ΆLamm`ָS"¸AbkM.MKi{*U2UUޫQĎ+dVV $"a;RLݞ)pn10yVdE-6J)CCzTSpg@Hϝ6iY- Ƶ;o&" xsvaJFRc 0{|+9,eUnL/F`feKSJ&pWFW &5ս{23\jy]6y4}޾rR9Qut ٜxnճ6le#Y#44&p|\:U,YΆ՛k|, ȊAڿz9s38_Ջ_GN~~gN0Q'/89z nRVԝAx']ppme%{tR)/wɧgWtsS|%Q1*jH(gy.vZ9JC"DtB`Xw1`^"\r˼].>4ם(!n'tV1jګMVYZ^2[aFA3l<6#בY2G|s)`;"HF+4<4w P ʣ I"2B@3ŔXmD!r0AF*2VN-X/#.vu)!rcJƔt4P(A\PKn~~6 x'D:2)dLSPza%H!!R]JHH;9yt3pfJH%R1@E˜S֐x0ȹk (T.ĘWu CdCm|&PFRdy20:;He\;oCJ:Ԁ>Ǽ ,p[S rf+3aÈNF5"%Sͱci3߅:a=\y¶.n  GTlє/ qa偗/( s O|W-cfgN"³uzEI%ZQBzfH)}ΪƺRB7㿱v[hGhg{I- (9H}i^ׂ9}okD+70ܓa(:}.T<_7FZ)?cb@BSM)4ՔBSM)4,n|aP^=9,J]^r$I6MLX}?꺳FU kZ9{o cGABw=Sws Y#Y쩀uIPZw5=GH1HpD -_,GE}[,UFk䆟3~j9QQD56'ӏl깰 |~L" @ۜ괂9AǛ!l& ؐtߣpXG^瑎VqIFkRCZ SQFgyFtz1`W(5Zmfq`}|MzXC/DrrPvgö Q=Q=ԧzq8>i_HqTp F.0|]"4FIV\o*v/v WnZ}R?':US9[IIa8J ͸X%>gɉy'޻TJjV]+[ъxg ;,EU0Hǔx sA9Z5}&A -5:8H# <\װ&`ti>靹qYbZ'w/.kЎO жb?'N7zՀ̢j,IȑdO3oX{'hw3Arn"AAw h#hۿzm*67. 
vO`a.Lל~ПBNHϓ#6'5Sqx[®_U"e `i-MΐAQf&dRdꘌ!+jY .%Y2 G1Z,F3.{gBU&ΆyϦ|HVjH+2R54?tm .c4%i\ߠ r{Ia֍i,QaevM׳UxGgO`ήm,Jƣ>UmUi*v?#1o%iyljPKfzu4N/ڬl; lfںyFϗw>36yj}8V7:)3odWlVҶ޺҇p!܎Sq,p>f%YIڹxXMx2`ǖ."C%R.Jpl7"8 cȪ%VT) d'mV˨A3>` )2L&h4P5TFjGjO8{.^g5-ye\.vJI,%k3Y+(=:Aa2SČFM!R]Cjڱ-x(@XE8Tn:=y?[ܒa7 h:8;*bE >ǠȌfϳ^9!eU` 劧&TJ4̐2u+A̅UHMd$q.3[j&ΆNޅ|k(R1%l2o:f} 0nQ.+~XŸ#He!,AH 2kcNd5 n&BЁyǤcK&a'Pty0t`YvCBh8`9$Mwm2^y^J`qSqT $ 1NEK0rK} w$))B풹Y C}#s AE GCZ"K) BQ'TXI@DGGQȚjka ;3.7|&4Eq֛OA^̗xlXٱ*y7| 2> ptF1N[d<˾JkXrGQTU9H BJ*W֘'mHy6\ jB s{WRC2(C32+7FIs;w>%ЪIB^Я}rYKnƤ'aecX+H9f[bapiEduՖa箺[4ƒHy SO)ⶓ54܃M=Xޭ3rZ}1´YHge3JxdF?G˪0Hiw]Ui[6o 5|6|2oY 1"{9'M~/;uLP8x:N9t2婚~osݚY' ݛϫM065]kN Sfy]JͿ49ZӾM>gb3|WKOGIfmxqkܴQ}9PQٿd!ITUƦP&9`c(<7,^-f) zCtL'k7Zf"ffo}!nM@-svgѰFђ'XJ򶻳Qz-CkkKfQь_z*lǨ'QrAh+xk#h>4F fb Ӟ:yZ?i(Hϊ٩V>j-4/xsP**)iD$E}<-,X{:+F}~|WTZt*'mhRMvQ d4 xxTG@;{ӌ0[2NHi%e.R;`g@ Yo<]KҙRDTUA%%%xZ Za`>$DTzQKDNCb&穵a{  /q4d^J+|\!c C[EĖPƽuH ÌМ]s튜m\'kL..5ga<7|rUoyhߟ%P 2 9zT{,L1<|>Xr.`9VNF.Jk/9v֖RKJ1ÖJ/c5}1t#Dc{bMGP:}a?yr4L oc-[ޱ7Y^3%^'*-K p뵘Eti*ɶ" Y^:M|=|rJ <[wM^&y6-O8 Ͼ?qzo%@vmf9 U`hi)1HW(w1,$qrot_vuŽpaS+,Q^HhwH)~̑"Ҍw 2q܀Z;WƧm&A}6&~Xh#J R*hL(mھ"Q$(8DQ搵#-7_GW&U 6%d\oGm27I.pܻ3.S΁KpaN(EKfM%7I"L5,;x:Iȇji5ȥ< Fa#0b[A:"\+rKcr "ZISmdQ0kZ )a3r6U)$mҍ_>G h~@Psf<@G>ΡOnRh6@]THd%2E.K+4,J1|+?z؋#b Q5H-qpYl\Z,ԤH-gix586 ʡn]wwSYSUEЏZz߅)y]xTHC=2Za+c61KQ!q"! 
{TFWڐ%>p)uєQYܧq j{gwS|] urS@R0d;1(ʰ;FuQ\}dΗkEH :qL[U )d4&jci2>$;վzeY!hlz{Hd(HR8vy^Slu5mH1$x¤5H`RGTRJmɌ%(߱LiM0&_FUinu|fPBxV}+_l\C0$'d v[jHマU՗>bK-lؖ,#{;F,fIZ$iUrN$ioL{ Z>s|is&ıqiUp%)ieQN2}].o5b;݊F{Vb[1a L`y%f85PN#v!yi?;߫L,_ fu;%ٻ1tAC8#<#8 +./mLku: =eM8ڛbv1g6lb hIpit~w\#XYj}u3]KweS5>t.9;9Xܕ ,ptXրS@U3EAB;(u"z\4t~a~Ӵ )DD[mUtRߴ|8aIq#25x_黛|"5z5EULLv4WGMʒo Qǚ ajCƸґkhܦK  Dx)Uu2}yf?֪':NZɫE_-/00b58XF@¤&]Y|W뿾/NP b_4.@A8>âr\̭Wad[XxiBv\jBC PjВ^4oĝ0/.Gu!m-N=酃;%U99xO~2իtz1|b"_@Ϩ.FvR|˳nG\P;fb +j@YJau.yr'qL׽ ?m0eImsu:(G3ay2٭(AlO;~/G'a4~ܖҿ8 &LԤ8_U_<+b-P!Pp b4%D㱕vSO ׻3V~8ojlFT2ݩ=2wRzBC?isDB*- yJ%,A ;9nGi<`ۻ"j-mzD57 H)QH1v yj@9DŽRn&x9) Ȣ`'wxlVJHH!\@G+]TUt宯;V_[7U1Kd$Oo޴) 2șW/ XBw9Rg7:FCwmc4]3.Rq$Af7r;EX۱pZ^Z$w1ǃe69k)L@%a4UTs<&9NxQ>_$vnbW뒍0|W 0m'Р`9VGKe$)V4EԒ)E 0;q\pMS2t5HU/ KE d+D{rwi&ڧ T'f U`hi)1H)h;QJvYɸ8",bBEuVFkCJ@/qmf2V.6B#`>Yf"wĻJPB S!JW U4 8(HY*zg՘ Hj^ˈi؀© i,f#gC;.wdD~7}OJ:RWZa5%B Nc\^26Nf mvM.yqݦsmom^jJyuMí%owNs^jʚOnZknwwpdޠ7~˅-.8BN6æbzS& B84uQ)/t޳ia <* ?s~]`$S6崑H 59j8M[atsquw' V:۵R;ҳڰv;6K[Htd p#MdHp sJ2PcIG5&ajj*"P2zE)5)&=)8Р-VgԤ?&HFlG|J6,bʌbm"IKL ϕT(M?Uǎ؊01oRt\;QMQ"FkM9*(oD4EAu!){D%yM* %^Fґ0 eE-9~NۢΌںCN]iWh)O9cN3Ғ 8NcD1#f,&8f!dhю$QXOajUdlZ '`D[""GymCp !OHY! jU18 1q@, 0ք&FHHqbKFbS,P% 9Ф H\JSm>ufmqgEb8r4&3k=QEFd$UAR y.pq_0wl$3pѼA\?*3UFp=Y?PeM `2&rz(bZȾgMTbe3RΟ6la0z1 9Po)?os$^p.d4kK,FtcgU}vqdIG~?d5mID(V:$WXaɀd'*DN4ODC9Zw!M $yv5?YNQ-~nb,>\bS)B.yrac'қȥb5G<3zBO< yLFXD!ym@)@Bpvssh(Aa52BRd@> 441 J=U@+B]o7wcz?<=g~(W__O5|dF]IeCq[:K("ϑq\8!a6-6 ߎ=L~)k_j{ꕖݸ*LjVh.P-BCjZl6h8o~|g~zKkVKO仾7J8Yl2mó'6T2ӥ;)*q Z|u1X|{R)?q AuÔms97>|/yC?Mל{}^93H# #xz)Z9er<3޳EtJ@oD3a 9Q) <2-1MmIYNYX%kvS%(4uDH3}Գ ݜLJ[O t$5."IAPjDj1,sw(g Kh"#!-Q|K) BQ'TI@DGGQȚj/ d9 _K}OX3|U|o8 P;g8dv:Ўߟ?jbm>>#X.Qo0§dJ[Gx2ìL!( d'!3+ū;U[u"[~2z-OsD @anNëU2A |1c$H.^3f?7?1ٯ7& ^85_j.iQ tQK1I4|El̵6RHi^\ucm}/kH-Yk1<|bR ?me eMyPh*+,:W]+nT2Vƻsq&l'8U50PI.HomQϿ2<$v€>CM:y&90oG8?" 
̗}';*@z]|w'5{f[$ RLs1Vnq켟 յc5cEgctF*_yN>)I=u<0USں$^)[z(>IhnƓd}mu}m%}i'׸^d`n A[RL@ obN7Jsr4ξw?//;ڸb`*tp~]gvU;G0wh[q$M&Kh[xO:~۶0jؤMWN?ɽ&Mܘwϝ&"r>wc$H~ 06 x5~J%R!EVϐI%ġEyuMOWSu綢>jп b),)!JdURzh(_$odsЮ29ȆBD_iwzy-"W\2'XI]2{QDϝ!<}:+566vLL`r5؜} Cy/Z{4gS-ޖ&e Te2Ԝ$- fODd:d:ud:=O%h\ 4Dk Zc(d  2&L:H]Ou42QǸָDphiJ#A4nS⬓Y6/?Lj զgvcj>Tp Z~n`y~~q Zj: `*:bxuKx}|ǫcEJ?kYsn ZA!ė-N#uԦ֧t %ߓB:F7],c7T#;ɕzyn-q4ba"4ER0>$pg%j%k6ՄHL(Em18(8N\4 "F24t Wј8kMy@7_ ɒt|S+wqIe7Q%5%bTeK>#ٹ$*8L2:`Di9&eU I+M *i9$X8oΠnL.g1 1Of BS7(~ZN [:X c7|Lݾ3#N+& IDyc6 y"5GHRCj?27]55HFyY(6=<,HNC^!n"q+RMƾ̦=bm{Xe.7?Ԋ !bͫ=C wݹDR>LeڜaTU+)c7gʦc(+q^LR4.TkO /3u.xbΙ,F)|R@^" ;"gJrN?At7!xaѹZhGVQ d#P(KU]7`ti=6bwXKQa;?lNE7~V>ƛVdKسƿWYѡ2g!Ԡ2/_*lrES_ƹ<ϥ0Pw]]ԗeFZ>:|td0Io;t9;۾Z(¡:ɹN:"v_!ZjiK t˻FwL^܌;zpf3\r.mխ_rYK4r hı>?:RWď!-z;يJuwq8~ON_uۛ^;L|rw8|=0'å"Eu#&}p>yI 6+/'(~]7jz3*$*ꃉ)7gbޣP-ɺdUgP#vlC5-&h.SMv9.N|dҮ ш!3dfi'd;Oie8UR*(ɊP=4Z6?]1u?k+ U^]}+4' !&IZt04dE$%+Tp- DϾ:2eid~! B59gp|;%T3238AzR&T#PfhlL<2a:lj;Vj>jQRhWƗB SX\!lHe.\WQm_A8e!fkOQU>*V1 mQ5=R %+g%y7`p3!0ԡ'5Cσ\U 'bK$с>iPbX]81q3HmodOtIN>s齇ҍ2$ܡ25z vU}m  <=:\ -EW7%Ӛ`+u$ A"\H=7$RTђ@=]a,xDф !joYHą@ŅHeSl2Q$P"Q9?dN`l1qDV yVO_m -g;Z5}ƌ5C-;lu?|n)EZF^? UjLBRB+k+CO'!['!Z'!['!['4=4G #:fҚ't0 D2`)kLHڅhp`2QY`j-mm"aؘ8knUMq1W,W/{xmQ.sikbmax[#a3:9AoK%\0 O"^]%%"A57䴳FFC4R9XV[\.itSƄܢR%Olܘ8kBqeas;_/E0XSTN bYnɳF 4/i(GK7DpSB̬%UvE/,h-rNz) D0DJ$V) U!0 ;VznZГd+WyG؀C}$( RѦdS$uER) O.,y}V ʔ7U(DHVxLB+">&9bd)K1pB$J $)Ť Rx3b9ܣF i} ; |vDHPy9 1EtC@4*Y ˕YE:#HyN)Wm:y: '2P3}!NJ 0Cʈ`4K@RbBu9ul=tE/omM/O 4U=͝yZ15ACq+~_ VGiub^˗:xG]C?^f#'m/hXBt ?lqU{6@qǢ޹);fkI=GcqDԫ~N!:{O/ O:3B&95~`XIP&sJFƆTz{6>$ǹ+/HC\l@,{mF,Fs6J9C@ZD0f[#PM). YCT{) y< YHĔN8S&&CT{1;v~Wڅ}%(at\` 54rg:94BhS4Opw}[s3ٕb[럇\/3/gηekbmk{%4qҋ̒h&L-]lD.d$[dsB>R˓rW@m%%VlD(B5I5/3M)B59jRgte.g,B!WVk5IT`Z#y{)80*w(E" ,w'E"ԮHdih{JɉڧH t&93;dP` dgU]1WYZ0m7WYJ{sb̕"OAU<.b֏ rT8B(?C| Cc=W ]\qG9ˎ=aqEq02gFdn=n"p+|U?8FKCC%yZF~aYYys}6S`2`Z\G%VT()#+}o :x][)=;|^픲ڜ5i#E[gc5[K9\E?m'b\'vm'wQ1H? 
b[iɽ^tVt(ҕ.Gte( {CWn}+g%Ӂ^!]qam!ÝȆ;qe\gf4ǢB<%GUJ^x=ߎ]5gqo@U{ɮV}O鄣5 .k!|5un..MgnQ]kրڕֱMmck{6kYbE mskۮϻBI߾ܵmlGܿm>Z7W Fz\ǭ/D}:Q9?oSѕ ̋G|0wB]=Qk7dÞ'VWM/tOmlxu{{ÿ3^OH>:}x>k; } cQKER﫩ӤX6@es;ߎNG-k?N Z-j<-Ehw+hEggk+|RgAød}@Z%tC1O*z0wH$\JQW2Țaw_Q/2fm)ad0;jG߽lޣiUEn^e{д*^jGi _㴪>ET؟րK7[7vƾt(a/HWA>{\:Z骣<zz럝@OWlCiv({tE[jס'o_po p̾UGkUGPM}Ivo p}z:hՁ^!]5{DWre*pޘUGi큮^!]qUU܏ OW1zte88z ;'s@q&8VG7muѬ:o xsų/]87:k>w8]s+W_qWNw;}npQ{ r"Ek7H?ǗqGU]}b-^s*[9.,_P o2By\;t@!5}s7:F@uPaP6@O[БJFo'ѕ]e =)<ޡ9k' 1A-)BJP$RDM+ʫ3 A VLta, hZ=G0A(辤#$h5w< om Q*^,\Ld!*T?V<{]bygUe-m,93|Y͌ QH&ٽ9''{󨓹hŧL,Sе2kѨwP$1;2y1ρ 7DR}IK[26H HޘS{X4i 6!Q u9@'^/B>,%A.|jFQNLCВnt^c8:IȒB\9v#`Qm4BgR)Ie'J!B2?A.jDao; <\- 2DMET qX5369BIj%[?zX3,vM5j3˱ ojTʥp#/U{)Ei`!-Mjw+$aQ&0l5p_lU ~aȖZl/mmK``=O:_`/d6L{q|8ugm߸0X_U)f&:hLA6t  ;,,FZ[0PZh4vfcd'$/G7M*3B:} I y XkƖRA5v# CkqŹ3iRX-dT*fX`J0dCB = >!J9HuX 7mfv6v}\ >.&S8R)P1|rՐp 9XI'ydDy*axa1+Zw[1@N-|T7#!sİUORuJ` tB)Ѧti΅1AUo<4[5gLF&J_XV=PI']5*YCL\@ q>96) 3:?oit>@›-ւ w2UY[8mP zx0iàe6ƹf@hF\u.O4%0M==>\2'ФSL4'aٗR#t!oh NCabnҘ%lU|ԇ X9eTǤ4k&$pe. 踒#P5am.?"t+{;#Sܓ`j?ZKB?}i?RڰVB#f\SEHa7 R0uV`T8VV }xV s~?fu@yWBR,3mH"EB`d/ Hv9VFQ#j-JLے] rlo54|71}ozK'5~LIdʗޥygףz5O&)êh>-u&/(KMAZ݇4ϫ>IbVjR7$Ӥ pͽ'&q] d9exB-߈xCu9*w*E0 檣YXk~@ڶH= d49H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! |I KYsgRAg bGCْ4r$X)@p 3MGū_&FK3R-+)Wٳܸ0Gg$W7/GiQ>F4;GVUD=#I T˳&4%v{D ($B ($B ($B ($B ($B ($B ($B ($B ($B ($B ($B 'K@^H`hbT ($^H%ETZ$BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $z$W!!\.kTIHH! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! 
H۩o%fo{_4,宛\?K};i}/;@ 46=&pI JK`.;p :JI\zҭWo9Yu6[ZхYb8ϙ,mr(zRWǼw0C}:-ߥI)ىuHMWdl f6 ٚ@2!逛z=3UOP!Zspy ǫ:o7FA0;-!@eB{rIEAX d;Yְ\Ce8Sq erP|}YO΂ 6Uf7LtLB4 / ky@߯A%k#H5EogLjxv3eyH *P],p>hPa̭5d?ta`/2v0(FhSy[||$8 M˲xY ^T j4x_eVR 9RTde%Oc2qږ0^0b7 ։ !k?l%#I*.&ӵV;R)KgW(Sx9sideD)u)2܃uκQRh-(@!I@\'*ILКurpǣ'2W!#;+y 鶃=Vȇcȅ`*er)6sf)k_oFdZ IiFR $I睩'2<&-_-pvo9THg!7W8|:OBD0;.NRʔD7 #{HP n]2e-|GU'g2,(AK8/ːS}!hLII.UAyo2PjLyAiҫsl3svĚ47?73-а_5ͪ|oT稞=w.__tj,p6nz6 ^rWRo>ٱhZ= SxUUZߛ̀B.5n7WsXotQfT%+(YZnzj^^lj^)=G ͏nN^.m"ݨ>ST] oF|rHEӜDfKl< 2^&~bJW)]rsJ;P}5rbs ; BX\)eb}ov +-n 72Jy*ipFNȱ0"TJh+9̢#Lg%;bXǍI)",!YSnRq)&n}>=ϝ5:ugalqه;@V/~S uzC *EC I nI[ ڝqYET*`yDeіɓeBRȆ[#̞frIр:\^DT\":`KY̩1 rrT*KDM$`q.qЙ9;&y xs3(Ogv ~R>5ɼf})~}ʈ kDza!@#ܑ9^z=sھByjP|e;FJu-;%/~nQY[bvYcxXvA*\#sy4FT8BK/6¸rQIJBtxFyDz<5Ø:k"OVZLA['#t A5*G)S* &FY{E)\.;e4')&i- B㒒d6r"t`>嗽5?_!/j_zoz_/>bueiR\63g-˴ dy0UO9zę54rpq_ W[zjղWު9L-EȼdɆ u -66LY>j<4ko߼<}!׍q]^[N/n .HٕiPDSb~EhwY4_ڛ}G{mўܤKӃrg p>]e4.d4<߽oׯvIoG1[s |wc7:$t߹Zm,_^zPbH'I?{Wƭ)}IC>Tڲg$/NQpJ)R[I$R9ȈNE8C4ݍ_sR9ydZb$%?ڔ5XGաCCFϬ:? 6}`Ϯj5a$>d5^j AaW'! !6p& T"}WT$ AAXRJ `}DQGD&z54x/A? &b>hYE8@XJ)dl<"Jb$2<:jBljAr؂Sa?N g;n-f}/]e̎d).zSc%xoݍU×2 nGS)j𥱴@EQ +݃*0-'IV ?Q44P`* AqiS1β ZoBK ^xzo?\tARZj?kn[ĪɦF˜o }.ONjtMV:mwܢ1XtI80jB?FФl]i)s+f+ܩn/52ԞiLwYL\U50QI/Hwϑ}f.9ь:Ó*]q͟f`" }z*x&90oBc]%Ÿ)PYf 721GAgFN0\]wRscdAҁʳT맰 ~C'O9PSu绣nRƋT-=dV 2n)d0>{[31M^BG$i8kaڅS}9PV d&I T*GE Qxn:pxyk]y] z-%pnr]Ȏ)*I2T-HIď[}B-ϛ9[$ 9hC%YϰL/3msg%Jz^2yW/K0dPЖL,-Py{*̾p-f~J%Ze*:uTxkqc_mV-&6(٤*WsbqN)_l" s}NuY/x3hvZjhF/=gR5$JΑ'B"E"5NSc {Ƒ$'y#w݊,_ l#lh Dݮ x b94ZᔍBAa/ބ^7Lu?5jNy'$p\*HAmݎ+>K`޹zĻR6^{?/;ڶ>~ÂjlEKyB7rΣ wlq)[@)) >ZYK0X4DD&p TQ h:vF&nI}iۅ땱)L߼˽G850`"1 5B`8ar0[TδF- mWz-;]}C{|,03[ L. 
֚`CE,3h"Mei6[ (WKM)0qZFBJZ1ok䛮a9+Nc>ߡ{׏#H0yAowaYU?t.fM:S!*UR߂ei\%VTQJ-#i1cRa]߷@%Χx>|9Zg"MȨ$WСhtL*"`E\ZI L((O;@2qTNX0TDD(zͺ 8Vt Z>Tkd,8$='*7]w:VHZ`j<8bt/(LxiTJxTHa#5։;nl g GYcrkBh@pfG펯p$9 ìf A~&XqKi3 /Nf02rJ.%/:2K@V;P@[IM^,t^ PQ2-&+U^cϣ GD)eGR{up>wy%IAٜk+)V3 v6DH1+Gpz`% P\BTBvc> Nj  h*_PD$#͑,r+XaQr%9x'v$!H<x Gx#ydFsiSɵ昆QXf4d Ppr//S ~Tkf\ΧJR;+h^#C_XE`jdZvڤo ɭ׌[G)10iThG"h@QAsglrD`:%;A lix7F;N=o0.?(=.wQĮ8XOeY*TdeU"LE=yf#qQUc=-i~|f2]jkͨSDaLE/EYKi 9c՟ttRlG/+ >(pLt m0lgwY[ner{\7f7QRi*㿏[zғ~rq05}bq r ~bq0jyq0*5-qE A\X+ X?|mJ\%rq;/3WD0IC{#J *Q)A\=CqE‚ {#doD-.3WL""{#\A+V"*Q)sW\2I\ jo|W@}W;P:g(D%{$`BwU{jkWJzWQ\^ x駂Ŧz(glcyNř1fc,rHNX 'f@'ӗY7,_0X;<8*#䄱L[)D%0돛1M'>Ѱ) vm8A768-`2讘 ˸ S}\t{aM &B! Ƙc݉ot3 ,'i[WUZ!XminRK?ՑcgmI 9Zr;AfF̫yaNg(?˥EǸ/'q nc10 f1gr9+ڌ}zr \k6bYwwflJ`RGSJmΌ6>imw1 rٴ<{&)T7Uh̐.!+s'$9VE&rS'cEXNQh )l`kZfh뷊tSX -G?wۿ`de"tSJ:^e\/Luߤ~W"5KP<@SNGlu|aQ|"l򵓬g nѝ8v[]a!29@DrH٩YTpџ%-[嫅/x*+ dp/ ~;@[ x>ɪTjS=ę2z[1< W׾7o;|o26xޖij>ufks[Vm傫fɭcSH}rJ1ٟKk NꝏI7XܪЭJ 7*+F\v]\%*9+R c?։GcoH\%jw?&Q#KO[Hc^z}Zq0jkGJøn2fn;2qOݞ ~Ȏiٖb!HQXU,OzX C@\-'BuTLCG-:7 _*V*PѮa19?;8` H$*MC` Dޕ^ y`Uo6盭 e|jgl⯊8E?.,ܓdE7s80JRQks:A 10zL/ ' E^dÏghWvUt)sťYE+hwRzl͟C5"P dt}]|RzcKvB5<[GY.pZӾ^S#7ToyQ6],y}V p*A"$OΙB9" 2FxRk6V;|w!2aE  а4gl0AzMi])H<D굠R㔕5JHCH vh bI,T,dX-X:YO9͔1uKEtaVdX 6@uFTvߞQ\˲;b>!IRN lcOo)Y6Ta:|Ĝ/ ;Rlv;b ˖2Pp??c~jwpҦv^SCL '3us$?fqQ.4كcu{϶F /y8ֹwH 6I#5ۮ >q9-|ip6?VJ-ht6Y˵pAb"Ao/q0;c$:'-um6Yer8K㧼? 
Q@CUrU> sg߹?^ZN\B.F<Nj8̱\2?Q ݧ#,7u}7O>_32ؘ~خ_ؕ ˏnXnj7xcY`2OϢ {6~糳˛e>Z-`۫a ht}<}6W|;Y楹2VU3ߪ~`n=0Ж^ZrҸ55[B |T4`p)׋Ԙu;oMd 552 \ u>\E Giө\Jl4TA҇*&/h} P8–"T!Rs,PYJJ3 b>Vvl4l ooy^cŬӪFR-B+'%8fu}W5c`u~&uaF  4khu:%4K`O'hfK 6ǤgO+=w}~fz|s-SVcM1(AVm>!$ggeTQ^vi,Rv\P6.`$'Yd04Un[LЗZ&ZMb"B9TX(9Ӕ"TsOY\Jm,sh}`Fc\8scc I2I*$ ZĜSɍ*qacq,ꆹP\\xg6g~- k0{g3?6@aGp6[3$< Z+/W:44g hi ĻHI[T"ͦ!j^ٳ#mr'!(fJmHJ7݌&Ps1[MIDZmfmӳvonS:p!P%4/7F i-/q !1 :$7ʇIHT&2^: ED9 /Xymچ18t{"dbl #6&G2 2⢾{F 6d+BD# Htt),q^-gPj"ZeD)gopjT,eqy Qxޣ'@q(AÌ݌"bW;2sMƤX^ "yŭ; 2eHUFSP;a!x)!yRN+4ECHQlϋOmƤX>d !;L4W- ~|ǜVG~N)xor2}`.r4>.gqh%}Z˭Run^$ \ե8|з}'U~ɛE ^( KD:tіEѻV3#>˛ZxoT%7,nݮե鷥3*?_yWˎZT<̯YE77^\Omn o0[/84n'ԫyy:X|!T*HrH{~*HЂ' κ||Jis_jexokS ˹[ Igٻ6,U'Ɏ[ ,kg0BI;rᆵ!)jJMDvuիzGVaqF2eE@V8$X`8vڡn#%Ci74jGgWW/`=^r],ݿONƕ\ 3c[2-o+YsC8q۔/<fn; = j+} USVV|5q(۶o^{/f1oQ:Ӗ/^=v')7O?] z8nZIQ34]ٷfMoβロ'3YzP趂GOìr8J} n<]Ք<_˷~Qm6t͹ Aak9nAx'ct#×z`Έ=g)#ɉPydZb$%?5XG{U 9~Z6|Ӳg^DvQw̺V/6Aa'# &6p]f Ut."IAPjDj1,sw(g Kg#A8ZDD6R O0؁, Z5m(Ar.AsbD}/X3|}8^dgĎl*ٽ gnX[Kq"Ge H[єxD3鏏NjXϿn.`y@8 N\4Mx?h`e6ӶZOTc@I?nJ̵{a9c'E i&aj..ZmKƊƪd<7JˤYo\s/U;/Xr0A.6p:>}u4=Pݝbk%^b{j*_g_^ 6$-Pk[2,8aՖ{,QC 斫D[&n8޷ ZVez1Z&u웫5{5Uyt &"P>*kkWXhofuAIIb̓(9GK2YJ 0rHƞq$^I]g[F.e,:0 n'DSk>Ζv.ޝѲ2 `1qZ@ ~`GCpI%J&dwA7=V)Ql=N;[B7zf!% 3Bs4vA+rq,./oe|q@,*1}2@sb >~>B.V..=^}$L7;ūU~hśp1䖩.`ft'qBr ZkiTav`+o>+= nzp5*CzNq6HB':HysQ bw*mf ^T.9cgҡtl(CAC0P.}Ƈ ߆A.{]'6s~08hoH/㻶 ?; jN\MT xZr. GR=’r5H0)K?>n;R!<;WHCsKFD_AsRRs46nнݳ8"l8% kY)>qmX ߻1Q8Jٻt7͞ݝyc^8e3vY዇Vr'pN5h#J *hL(mZ"Q$(8DQf@[/aaݛ*{ù|V9g.H:K^y$x5[o8qZ=E??H5 sZD)3l~JrgJh`^R|(F>T#LA.t y0 c7' !^)X;FsL(E5@49Jj\#1HX+\$JOH]ȹ]B_@l.x퇼WA#)ڇAX<ZOf" D8#͑,r+XaQr%!{&络N1cFjbKb!=Bp昆QXnS >@G. 
c&Ir4 4<.){d4(*{ dzUz2*CW9G]2Xp2 ?_e5'c)N+E.~̫|_W$s&hr9sVE3-QK%ekJ!v[)}[U |!=0lǣ]SjQ oZlVt8˸L#'sFj#)6Z`т6"Q A2 ǨUI " | G@J^jFu,wFΚ:!1m$Jz.F iԏ< kn6ɿkVu(eZM˴MhBs,#֖Z荔T %TmL[q# ;GB߶@skț44j#BQ"P4:&0âJz8APvdv \P)U fb߬k)XKGFֲ`'Up(52 CBQp \,jrX#8pDi`oIͱ%cT(#I9E=uF$\F1Jxσ,FG)"N83T8#eX"D@?dC;?k.c-?Iʒ< lUFarDZeo{ P\;J:bʥyR"?#_EB!^'"v]&:yma a3!L9DgRP;0E.l8}ϚJ`!@B:mteTSr HϝWZ2_aLXLGФ&p]ԣ#{|+9,.lU/QLV)hYהjn&F pѵDɓlOu*nEc{[=yU>x6_<g7D1 {gR=9E,qL lm-]5Cڛ_,BJ_:<[N t1kwզmU#`bjeCGRÑKUØ C|auq{JYQ5Ϫޠz9,o~}^N|w:}w 8nuoދ}1ߟ9Uzt)eܔᗳJIgWvդS)oP팕WaCR5:9pua+2WW: Qh*v\0z.]v .|B`̀@gvn>Z5} 6d>Bk${UJA2X){0ϷD)dJAjp{pC]r|%Cc98DEVSbfd ?{g㶭b 7@{)Ie8a|=G=8cy<#2RozS$:<$)B 1փCc{B!a䩌&vN2Un#s5'Oa{N07{a*4î\vR)B1} eRY"-׹ B͍Ʉ 2+3F鰗oq6Hh4R:be+$s`%.`.k[x(dl|n ihh;sxݞo#S#O| M3cܻʎqn#.9'" )(Xd6\̆P+}C Æ>nHRQF :\\+ReƎ+TՀJJK8MW(Xdpr5OWDo]J;XW}ĕRH@0e PR U#4\\pkNq~nrqNF?4|un0eH>!|7~gb]mΠr6Ȍ2lDD Uc` >ܸ%i>i2FO0f?QmAd=ĴQZDdpre2>;TiB !d]`MȵĦ+T+yB 퀫qz5&4ક`ֹϮ\vjmGU+<Ԁcj9KW(XdprM2BVŎ+P)pC\1_q<Pa[Ǝ+T؀kc OW\PB+TEB!!pMW(WTpj WwG\I, ?S d+TJNpC\)# fQ.W J;Pel \= 4%XWC%pK cҽ[[e=W鬎_5*bqjcPCÐT}1{A VVŪeu[Ɍ,(6{ߪ7g)n`~/V["ou|Fu_M`)PB'rV>\ )׻wWm MBB% grB(Ulz;mmͳ7Jod9e&Y! 
L V`VˍPxUZRWPeY-Ea \>mJ>a:7͔.ЄgA-LJ>+ %Cu>najJh[s^d"*/3FK|`5cEJ7( ̞ʵ%$U|F,UGׂ!k\- fV%J{*I5 Ǐ]q[)Uc֎#UF:ydTGammh|w[7~ik_/_syikh=uԙo/T6-?[(ٝWm:6C !kNg2=TvT5~\Z:&-د|klY#6 PQn-eްѻ|lqЦ5pO#~<&G_/Avf_N6t3=e>KER+QHg*jM@LjrAVLgZAE2KA-WogrUV-~I`;;58WҎv*Yd-p\[P%HBB$+ky*bƎ+TH0?DpS J P UƖ>aՓJ,OW t+ke*q*c}?Ip%)W&+lӱ@&\Z!cTjUq(HW X'j%jG2z Uj=ઇҔ膱v֪Q6j͘XN޻M-(´fPd Q.Kf ;Q{iØ"*!\`+Ք+TUBrWIMXBB&+kH2Bf j;Pedc\= b%+IW(tR-vJ\W0$+I8Q4\\CR$z UW}ĕ"I@0P.KՎjEBR !41Jpn\( - ;Pf{+YGNvBKK4p=۔I2,Z:UvҍiYaĜk·&1'jW6L'MZ,dRKdH33Tr [R-RZŇM:kYj QP%Vѵ``&+e:B6e1ID+Ti쀫oWcBg8W+q|ryAZUMܴS*>WtձUOٍ  &x|K]-ifUp$% s PHƺBBJ!8RpU;Pɕ6\ڮ6̷R0#3Vp%%@m'WTpjWU/q%rUϤ`P?\,.Gc٨BFuׇ>}EWЉ9cU¶:>͞mɅ[t Ўo?eyv!vZo[l2T=B99M/+EDuu= tԕMۯ\(4/vqi ꔔɘ(mDk-/X^' ΆbmFɥ6:0L[+))']yiYy)=ya(.Gad`c12qQ&ta2 !RJYK/sY([JAJ g},> n>,c7b~jwlL2צ|}V˯\O׃*]\U4nAX:xz<վ¹2h"rlX,ގB۲/M}O xk/e{b\$)c?kW甬F٪y7S3ldQ;iAu!"(fKݧgS(aUzT9r55y rHL/T㯂LklcT %Ny#!t!4r죑_hшFPSSR)*SqÜmWYhxf! U*^ZyR|~.yx srpJF"( bl-.Ղ<î]޼6w"'MXMU/.STlfZ$WNŊ?P帧G+8Z\Sip+"Gqyp )Ĝډy.!W#Nr"SEI =kB9St0)!Y  /K><(68%n$y|txivٵHmo< M#&JD5^J}XВrrNZ*Ü '{ߝRdpxv JxZ{t~T`)/5TČb >RI 6\0%9@LNЂi3ny;Dz79k<^^)Wfa*errf#1iW'M2iҌ@8O"eo_kw41CG&ˡUv]t෽}@ u-[?~/ X0;2yG8RF<́Չ;g˾2*s~qF甏"p@KOXExt&zE2.$CRLXJƇPh\ UA༷TH25:Q:48-wVz=Sa1qvSirO#}CQh;ca|YTP m}eBd6J }$FN.Py&~;:nJa I"O8urSQDn}?SYS\;}JE^?N%wxXR!ʖ= hˈU*Ɇ(,IX$ьKn pPxgDٌhEmϘwEOւ]![Y'nWM@ry#+NW;ilHR. 
N$zj [KJ\9ǖȨN{h(SSC"2XLI8AQ]Tid,&vd,Uaa1 Ma,= gיwxP/x^VlTwAF̎؆ N"h\ι myj3Mr >#Nd: G7l9`{\頄W4vĄ*dYmJ.%v1'ڥc_ԶQ3؍a  uʅ.%* 2+5j vH$N&o$h!#3t̛$( #b~<$ ڹxXLx cOD,"#bƻ.$6$*"PܛS sFHDG@I ZƔΖj@8JMJSZqƃFDƣф4;MEE3Kg.%q_G\YLKEZi=.nxXT8ZQ!J^:AdIiF(!\Sbڱ/x@XhGq Eyeor띵Y[cx%&@g5(AЊW'Vy}%)\nhҰjua̹?t2飬5!1$íQ&OM^9 N 0YIޫ@*2@DQ`X{;hS'ӻ`1?FC9Ox5[byvxlfBgΦe!|##py&,q C:ȨGY;yA˷KA(O^$XC &ohژ}&敧I`O\>hcGs֝w!w52pCڎyۓtKsK78N pqrk[K2rJ0wDdPFA|3X4|l@o'7z鶆OLugaL0L?_/^<OMv0O ߪFѓ /΃Vˣ\ީ@ aɨ%32 )}ɂ?2z7~:&}1gw5JX/"&Խ,KC]ER݅,dӘF ;yn'ߐAN.58v>YYήޞ)͓13.p&8sHď:n%GjXgy]al)y \>OLjF9gu>YjU!G RP .ȍn ##eD`W;C{i=^_qz.`v@_";\5wM,Nضzt{4=\H1aW@e^XkŽuN^lsjy ϲYfLh_Z$H-sB!**増h`9 $Ѭ$/rYgNٳgCѯxNjbk x*q̀hl%A z250;l=tWj( ON9ڜZ疡Sܱgy:)gŸ<{lռ"fI̡ixe U$Se@OV32 of/oȵr[.bx[14-G{юGw}VGΩ:U{}qDQaIPQT h5[9ud|XY=| H#EN&76LeTyd&J',&Ύ0n'ܟԮ^dI¥G\OC8|X\ޞGtU2o?tH.w 4$': Pl&&phi<-y iSЀf!e9$x]-@]EM aZPD2V`(~dӊ<PMQ(T8٧.uh%(I[wM^h@8p%RDR _^ۏ4t䑎0.ᐦF9VR/SDgۺr_n+c&$1qN`i%W:>$֕R!/;y0F"+,)RޅSZR4bdR010QJA*%G*Y gx-NAY~|{%-+ZC\׆]\< bF EI%σXYޥBi1L/ 3K<{-{J<]F"LIte NbB"rpNSw`pnWA^W  8(Y)A3IVTsq .f)ozlY)FCyR SI=J0GJ$::}vfFC%*[[b|uKm͐fx{'qHFqQ5dǣU3El]nm6lkyt50}~O\:cNnJYS9Ϫz\Lǯ~}}|/'z˓/O0Q'Ͼvr):_̂UugI}1ߞwR9oҷŏ>烩KWW%]y%Q͡TAb CFWjRKu@DQh*]0szl.97ۅu羄b7"nVGӟDQQ‡%ɹc\P"hB+U^cϣ8~AR>J(?qhХW.tИ皨= sJL.ba~[AX"Rx8#8RBrr[ v(0>ԧ&r'i0O'I7'>*lmV/<)=|R')K3=0)$|<߻?a%n/MZWcNRJ+R2Bl1a-uVkę+IϓÊC=&]A eS^ U+.zwg߫Aw(<Ć&_EQyUT61zQ@󢚧E8幜Bų^UZ+@ٛv\fcq5,]Nɲ&w s$kjMpê xz3&%@h"=EiX\Ic{^N~;] ܫN#vO;ɒ_e8"5%#-5K2ˈN7 ix58rr_g'}!kR;n;hU#t6*U)wg4-|%Za+c6>:#ɭ׌[Gɳ1l #]`YWӘG"hleF͝QFGRUtd^ټW/`OYl3?l[hs,+E(A# ;k/?l2D΀MٞBBX0g)1=شRlN 0{#eP,Q^HhM@b0%!tK.L+&a7a'PcA/ ?4wM%o. Zb{ ZDC6M_ WcE.t[/ĘMi0wÅ&Kk, ~7(Jsv92h01yOQ{K 0 `flOoƂ9AcZ  sOc5C,*Nz^. r$Fe]V~(Ai ApP. 
*zg՘ 05e4zl@ iX_­h6*ĭ2ƙI"hW:歵kϷpC>*^ nc mvC.}YYggonwCȵM](ݽ{&͙RBqXs]RԗJ4n:u4oe8ծ[5/ UԼTr|H[5O7us4,lժn_֥^ 3MۆjOmjQw7w:WK[?n5us+ |s(b[zccR۞rыHݥ_TedTvNGeI{Ԇ*̓i"s6H#,ݐacc>heTFg;=] ,/w<]-.8B0\Qm8M7 5N( EyCRy XQ;7 qR"rroD厨K 5NJWZarif'{dž>3ϊʛRq:2IH\aÜL:Nn9Ta)}rY]S %5YQ(" l $s)263Ffd j0DȘMȘdl3cW,TPvXxXx6gvqIRp[*3p0}鈭(x9nRqd0DB%9 sTFKQސ@BR'_'4ItY$R:XVDN!rxT̽A\cW֙Q[w1صvLvJsciRZraBZi̽`(Rxxd 4d`(P9AQNq )E2֗~Tirfg3J}c$q_19ʋu}"v2Mx GE/R"̐P @ƚyĴ @7QzHL8cuʣA94a#γM-;}V$]=,FvcgSX^l(> fgd0j8,~kO*8a}ojb Dc#~ yPpQQx⨣Fn+(e488xwE,qYÁ]Jє[tnPBmSK;~,#R3/i",ДL(zge` z3C95,E\q 0<00aQ4ҲH q^aJ,(uHHIbӍ'oT.>__#mowdN;~|[][Pbs4vv-|GcO鏥]#RWJ$S8 Ho")H 0AkP7*xBדO9ՎC+u%=?(/o,ۿ(M7k7K?%9Цba'ަMK-Az8VFZ,}墷jvaKQxҔazSi [2Ld^Ju^ּ~ǯG՟ţ?|zěv`p\ғؙaJ7Q"N@ Z|u,=*pQ||>Yԃr ?;r 'ƓZg?뛟_~Wϫ%v"]Fu 06<}v<|#-8wF9cF#0##)A&ϭ:jꙔ)?-~ӲfUrh/I#$'ifvM̤BH ]Q"4*aAJ)-\B<qGr? *b>hYE8@XJ)dl<"tIdxty9_\$IÒ (c|\|q2>/NKP;Wѿav:^NnU?{܇2Fq<9i?Y@~LV?'S<*aa—rEU"d!y mqSeER_VnNKB- Zd0a>] U2AqfF~X1o9_z?!B*|5Җ@>j|#&tn _e*scj޽/.:>UNe<|/!K{+Q ÿMMjlJ_ޭ^ec +Z%cB6le=qͩP-Ҟx}i]mHg7C-˗;AZ˒ Jz{j(CQȦK&zUjקW/gz/T e^, s+ǻ:B1u(lQ/^-N[W@Cg_^}]J|t|oU}qʧ准jX7c?dg{zpz{oģ7krmx@g~o|9OKK?,}"I*_(DՇJ1ԬSQ ئ=̻ޱTM7zqq۪c.cwֹCΖTQTA_ʃk_}ÇNٳ ˝>:<}[z?_7 g}~n s,w?eKJ%MzuIfy7K=sٷYz{WF˾̳˒+x+(PaS%{Ve'<_h56crܴ>_e&jlG/}g};7%Agϯ/o.]-='m:l dqu8_r_]1.eVsjVS,znugkaH‘GW+OӃdyZX^T[T蔊 )U*5 [lYךpT8&[ƭۢ Ez}U"MYlIiSbH"6R(Qm cOyg)6-W/Vryn\7,|^~ckKoӉ39mq[ f_^d>zMd5^59oDEzmE 7:+N߬Kcu6Oލ :9 |C-7'8, E 6hD./})}e~8ꚸW}X6?D&[ʿpʃֽeb#1m궫|cq >%M6J׸2Y*D8xұZ lK 92o$O\\?ijGń;S E\vxmSD+⧓>EP􁻫=쵂[{kѢ[Q!{-ν8w}b:4f*fj]<jyDctv>+X )w2Sq`Ӛ{P^|sQ,i#.8OFEVƷ ܦK4$m dHU&8unEj}6˨ oyKhwe*KHC?RH*{62mc_Vi*OvLi7PO^Z fI1AU%&u+5I%eЂMU|=g~ҥ֌%)m]3BJ>YLuL.ZXP- u*Bؔ LcSV%$eʢrQXW>C{4ZB)z,GAuHJ>)f46**RT>|zrDcUKk޵]Ț,0JX=2`03Dg̈ "ЁF}|W.Ncql4e+uH+R4֧D!۪y{\&a9(U55+J9QJr-^4kaZ)*!w3DoԱisatc5H ݉2VM`rkn_GRX:bBZqHji fLaMV .AHj%&|IА)A`6ՖLpRT@@ РXOJ=_:gICR [AioeMX1jX䒳"acV`FY)dSp!X%(xs»$v$~`)] >ۄl X1%[0݊ #J9+N=0 /5`;0FQڍH"q(jl702x!OZF%vLs:؜B ր!-CHL%eOVe]}`NC!()ްl24 !WWwAQtt[T9 Q ލ)l8n%n8)Z uUTIXI2$#<,fЄ WW c c{T<\=7M' MnP<8a9p&J  `WFVnwL6B̀j@o]T? 
W c2o 1c $L Lh i%2Y,ޚrϩ., ;bG]`&Ca ʮBL\΁̐b]K@|׃5x&/2Y{S4Ù@pq0Xpq‚@:$w=7/:ɿU@HҦcW6;'2ZCܣ. ٗ5 N%x0OA]DܹI5< J1ZH;7&R6 |[4qȀgzLG %/;ҮG1A/E#fca2Dfh$%lB &(LO`*jw7 ޤh[֍rǻ` ئy>҇ǀ0^-LP8ҰLJ` G%ׁі棆A˅Ye ̬Ìc (pv> k*,(t^XK( {'2H&|Z!x^@&UÞ. Ks4xOy 0|IE@V/Vgx8 , xi`unT$ 8U٫XwFTRvj[&K^V3AItvaU0v߿n Ηw?Yw~S ,!VX(.lt13 R;QyH!p9Xo%L'g?jÓI>RZ3%W>Wݳ4fa08H5IYr⡒Ē($V*"2##"#\^ksaܯF`@R\Vp;Ńd F}R_ad2e"vay;gs iEq+2LF߮!qXo9u>cS'x#"wVUdr5k[Gލ3gh^:_Eœ _Ҹ<$Z:1;*o¢II_$үmW3HZJSS2ČResXY, L FB2Rb&QIDIt_ I73Zhë &H&l`u`?ߑzDg7K "bD "1AD "bD "1AD "bD "1AD "bD "1AD "bD "1AD "bD "1ocR" bB "HԾ#b1of2" ΡYkok>:P^ꏡ7DkMtȌWe64'\aD^E H L X* %qS>_x0:;ny-w],)$Xe5|ۤgܖii&J0e!jpSsn;qC4Ljd= ˊMVF"{\QIФVc%ফHX %%AHXTlIRGоဲs` fa4:c9'JerH-W4J6Nn'&21/8;,9Y LM㤍Q1#*8X aR?/ AlО,JPj:K躮M9hyλTL/]HyRb0޿f= ,ԡ /@<+*8{+R;@d"vtIF M&M/Vk" ^*.*iRMZ-.,ō_CSi5tӧ2pZħT}\F.saǟޥɼg[4ZO/.Qx:G.Uӌ&+X+a IOT=rػӦݽ%* ?_fW~/|q4 c\0g糾o"jß59 ;H!mOIB~EcXcd{W1 G,XVpuѓg/hSyF="V˧NR?:W_W"J+W= /g{y/O?~_?|O~|:OV@LSUl%ݦ<NnywjѵK_U?~0'3PU[U>KTIeK ca>ػ^Ek:za!܈߅w1k<.ܿ.,|dpkbh glxGkMnq%=!;JZJ/lK,5doQf9;WĬ1Fp۫GڥϟuPw 2m$PəVXE%&ߩE]LNog{J =Hy(su2I\3b3;>ے#}L_/syPO.BORbҰ ʘ!=.0Nq7R@hvNE 351$Tjq铇)4DiR2$J$L'NJV70hؔ`PMOd P1ݓ/k Yf`mG6yrL6VS7JI%b͉UBFX9U 0|Wr uB_,opNI$D2&j"i-HIq"0px_0S S+ XVVjg3K&c&&{9 %^.k PٯY^e"!mu?P(W BVSJ:BÁu1nt<Қ,ʹnQH6`N gc{THseHa.?y6Js5fG4ý7qG7ˌiv YKGôA!JD5_3%hcT7Δ FP6ps~ /1k+OF-iʬQ9RTdL8n+ ދe,V^ZVN ii,+w(i*.& mNVoY54_dɸ Yu9[YM m:ur2RRJH* HMQR/}}IdwFmw!ͳIqHa1V$#(c&2TEfw<ŨR+neUqw׏W2A1\RD}e2HW3W)JKVF,ۦQ[q[{*0^jƩ&`TZQcTAXXa3GyWKT!+& ,gCNɥdfo@w-(18-"Qe:S#u;˝^iTlgyL7K6 GM_/G6 +mJÿLq[u昍1pfd N6MjHʎW}$E%nR(vuW#S繸LSj3OۧA?W|K_j: $ֲ9U6얮]xO6vrhӱ{>ImsשjĀ%|ZBZWZup[:lsѬ:%,ٺ{wސz^jy>L G7Z.sne>uK k%4[ S%:Ly3WK+;n'ΑDoN"hdY\fn)~L 35e4jmC 1ǭsؗLj[*HI^D}$RɱQvN%K~ʎJ bȜ!K7c(6)P:f;VAet){>"&L#({8q0~sj NAWTa 1Ha=¼S& B84uQ)/t^Z`HjX O> "rroDe^پPYzYNu-ggudtGoE%&pQemH~f|f h~8r݉N#4!6)ɤCj؏%UXJؚ 0RhȪh-J!<T N9Am7:cdF/&3Tn͘M1Vɦ ͌}uʬ e  W.2h^Ml8Ai`:5"w̛.啒k'%(a٤`1Ge孕(^3 j-˃ڤB:`^en*w\lvVAYfg>H}ӐhlcOQ^k^#vxB42Ñv jU18 1qY@vkBFj #$$n8XG!1)GIr4aK;2klwQ[Ћslfɾzg֋׋^4-!AH)WZKlrsja 5Ja #) 
yCcчfǾdևd?}xѽA79ޏHQ2hɗyrzFgg\zK(ڒ9;=~qP8`v^\*ӬůI '0Ltm_mt@TycɓDjG0pd^ʊ\V8$XLA;IJR!Uxd!R&\װ10+@.eLD:3k&ΎNk\y1vUmJwA>9Ŭm9fITz{_R3/i,Pmr4JY5l73DYARD)5A4B4CA10aQ4ҲH q^aJs:$aXa鍤''*DN4OF‘5sȜ) ywC,uJw@%J&U<[l=KRFSHpAo)چw.o'3+vz鶆OTҝRǓtƓF^ϟճ%etJF 0 ]p>ʠ8hy23|y#jG,U}c 9Q <2-16xn Q{DzVTO~:*X%1mB݈6/ v, Pi7>"A:ne^ d|4$01X<.Qou1}?.] ᜢ8cmWubxJNU!./?y5 <꺮ڭohHOM`8k[,]qӲ>qiʷc.g[Yp1L]Cvfbꛬ7KnڣE\v٬4rm{9Jrd|$n.i}\7~9PQnI?Of IΧ4Y:/.VhgU'YrV5QCӧ{xr6Ysp;xWJm.AvPMުa:Y1=.Pҷ \l2/\s%\C6 ;:cG>Xtc{_c,`͕ngz~.ttFkfo;*'fz|<$~AvJ~ᲈQ(H_^A: 3Hw~``V_mﴝFMĝ!QSIbAmzLĴ&M!DjDz m7;yM*K> Ê[i,=.}sbWd|s{GJ*hS6 e9˂ҁؿ)̣ t񬬬ǒ!x)£UǜAlp(\ k^,'5hhtA Ue.)]O2de <$P, \Zi)h?kt@YgePGAQh}B@5&Sk=> (T/`,^h!ɼD)jY+|fJU8`1waʽ3ø)a1wn\Tc3E ՖgՈ|x3ɓiRgdk`y~~S_?5hUA*@ =U^jl,Mih}ت^2NHiaS+SmUtn!?]PGWBU"T\˂ 3=qu+ +Q?{BZ8K9@[s%i7^+k)˘2V5%a4UTs<&9NxQ><^It>txδwzțmOa(u~{P`q `m/'. &c0f 6Ke$"_jY"M=YClv6۲Cz.LA[A>r7 w J;ĬWa Gjd)Z{1[,} *{cְHG:94qa KAP6T9/Bŧ0*f.)6ɇIQ \98MFGO$@kf~mJafoiy> Ok$q+!}A**DM WIWn< *x4u}nr|ra(FOn%0x rn'.!w8x;i>-|1ú$)`JA k)jXg*Oƣ5z&EIG`cVhtu8zb煍 m/ꯪO=;yUUi/>hQI.'i|^<+xI[硂l)XlstX\_MOA/Of/G5|i9fmL/J,s2J`:~Tc,)| ʔV0bSP?۽BrS7R!8ÁD#Y`@6%p&4DĉD(o4]|ݳ8[tLG|G0Ў 䓞+6Nhm Qn> ]ؓ!sLMGGY>2%I Bp3@"/aN/A6e %;ݲN$w]C"yg9, y0+lq#xK#7HG { A9y!`QmGsTNݑEH"[R.k%)Wx(llD׵6xt}Yf Z 瞠C7 9j3]qHrNHf%2E.K+4,J1ek-̏t鉣c8"\ŖͥBzL2i1 gyC "yC1u]]%uQU3)qyS~u\yݙ7 [ɼKюY ɭ׌[GI@iTbuo*kc&PfTetۨQ1`p'E2Wp6q8 XǓx;lދcv}}|zNQ g,9(y-&7e>s[FɜF9Y`°SVFkc7<$"^o·zfx9ak؋J *)"C^8`ILbx sIA%B@"ap)L V%\ MYJy> \ Ggd6qvsmzDȑd8I =|[+mKF®􌛮waXR^[U8Fp 惊&Uz1sdڢh4[q­Rii0HD!|R !!Gʍ^ywH6qn@']mo7+]Z;!d7b*JFF߯3ˈFv;- {lvUbnv:̓lDSӌ|)Ʊ{K؞&UhDkgmk7(S( Õt3b~JLhcuQHNbBbdgTQZ_޺@iJgiM_=ױ IW ^IG}ѯmAGߟn~n ]z[3C`sn>M`.k/(WTDˀ Bd;'UJJ{ 3)\$Vx۫",)]H)RBX#ّɔCl252Jda K4>xg / ހAZɨ̲J*>}Ub|U5S5p1$/:-K^1st0+H1% a.&x7g5"+9mx>={>ΣGKS|"<7+RxY ,HeK @AĔ ]grA_DMM(ሊ9!4/aP#V&K NG1d=dШHmi_p%=%:FDKYR VY|"جmn)k_ g-ĠhGxPWj`S[C-eK1UeL{[c0\,[Sa~x?^^ia~]q&kkVIm̖vjpIH-֡NkVh?MNflhE_pa*u?;f!,`yϩ)O8,hv_JtpGSG?0d1K<,HcyG,(5|+*ry 7qDZKC^rutW0#ڇ9ooxo`r|0xZ&ݽp>E,jӟO{w{qz6~cXՓ+{2WuVwcW?Eayϊ4=m`źL'ˉ9pNU{]>^W3['w|}-oIԝsS>6Du"Nq˯Ye(/nrB\ 
YXNs02| 1Gʷ|j~>&CVŅqżTDV1ܴ %VQ&Vck* E %D/]) @Np{ҺtIQw9aS J@!AZR &/XSB@@>E+)Y<acFo&c;_F6ClpՐ29L~8_ͽ:Zu* _#:t˃ w8Λ9cr|d[l) ٱ;ꂀT2GvqֳWވDS7Kl5<[3NԄ`=(֫M=6X d,TƧ գB$ny*hF`1*FzU9tltl)^{ h,`-.drqw<<^~6?2’Kg E$P'g|(Z:vKW7bCcJfL>`v[c+kuwE`dQHI2~Wt"oSlr&TLVb\ߔr٣yd>8`o&u ѧ=d~O.'ݫKW?7~|8KP~Ӊa'OqTNKrճkyhL-3펚j^lBHKkDsSFVThRϱ|<Y:t-tgHU<R=Q >6?֌=݈`< 19gR*EKiM* )6J&0f"(&_ؒ#]1cL2@.*%+ХuFac ]ЛTZ0뛺zN3_̿;>9Ҽ>B xL@^O6&cJ66t6@eG0 "tɦeUF%O0/*w]^}L#p^$g PKe;¡F.$$RqU:C 6DV=[&=ٳ[̊aRDv1zx~z/@xuo;9 U6K;7 o* n:˳<ӿ{;hd'Bra?K1U$ut,Wy jYw=a2jS]I=znG?)59A"O_ ړRJ?ݛoo/sU]&;܂g||&%9(DNڮ3N>b Pq;i.9AG]n_tfdfozO8=Ë=hZ&7zpYrּF9gr#r#=%P Ci" A!4 DHC "I%CJڐ6 )iCJڐ6 )iCJ1ALcUhO2?]9ꉓ.O /ca@FlSA&թ 0PNj@"D s(&BEV]5dC&g.QFv.Knˏ"E2a2R'1 !@ H]Pƒbб\cl8$3do˲ 9pݼ}!ɒ);B,9vE aaXssй >Z\:ր2ULb%]WR-Ɗl8w+*֤a)4ƞi% Uk&'kVJZT3nMwz|tW]A\Ħ[Vugې=P | Ebe}m:@כ:{))ͦv>;A¾P dcUDzp`\̍ڭc][0X&hSm,NJ6ے xs}8.HgS{XceJ`a4Egͮc )Ȏ#( &,k =l6a/ sUø)|ihk,`WH|$*V\!tJ2"V dثUA)g)jjH!75Z*$L 9v&ș=ie^\P4͆sEGYPMu6ull`xk a _;V' fgKxcd}0AIB`.n=l&C37kv:;~3*79ޞ8O7d?Pwη?`sV&=އ dHz:߳~EO{oOTEtK$YBI.hc *>3\NeLIʑ,,LSAŸ9!-baQHFQMP "Bcpyb>&.Z3`qZ/.</6#m^y֭>^W~yG3>tY`Sœ<רdb"EzOb,/x0 ax`\%dV*#H2!@ PJ.I/$*kmvnHpx*%֒TʲrjlTw<$!ZhḓO!7݆a0 f Yǭ"㌑|޻[U+)nL|ھzD-:塦Wc5-pzZIW8%gIp~s<[l}Jj(EH5"EzDrCؘLb5G<p|;(z:IB,Q"QH"m^"U,f 9Q\0LKM`<먽rS=Z=? 
̺oxvSzIlQBfmzv=P1-l !Wn6TGg.ҧ ."IAPjDj1`K⹏;DO0YK}WW H;OC3Nm!q`2MkxC4sP>btۿ.䙇]pNQNG`ų4WE];屙\e;>q[x5Ŵ9Tyu~/;yLl}'oLƃ[qM:3{yͲNEҺ~{Y$DzejCb|are]p֫] 4; s"_]$j^$^ʌ47.SG%Bzzc .ոn@%gUW0}'pS얛8W׹WS VGj}O@@֢a8(ޖdS`f( \2/\s%]C0'U?Rn|stjV ԥPAy`Euis޴^-ǭUT鬯5 PUk^si1obrqjEG5$:,i\1mX: ȶj4t-V"s&%Q3O9.y 1Xd)FJ1#I:KF+~meZ:~j <Z[iw)XN\ Ww"cs ?+Lnuwx57F% e"#:Kϊ-{iD㿊OʛOVF<|W PC!떛';.r[؋ ,rta5A>ȩ\MP2؆ ʶ;!9($)*HUFm͊ϊPmMY]6e}zY)Q|㲝>בu;[D}2ߦ|\[\(vF<ӫvic{L,c ^CD&p TQ lh:vF6rIȗ󞯠c G_v ጰ!L=)g>aaxtm1 &Cr1<l;kHE>ג)E 0;`lleB/V`˵G4FS]o3,"AC!fH`{8R$K \9xMi˹wch_v':F?mD󖝻邚3$ՠ<-F4[iR=JSdg*¤.6}oU*D sY04\`D$4*uUnm:ӱ߬{#XaBEuVFkCJ'mYz!BEztd)Nۻ$S{COz`Y6r?=F3r1Bt[4DC6-_ W(u B(pky+sݱ;ަ8fhD2_n}d-Yb7oJB~b- c ˙}?+l}͖dz; Y[˱N혥Έਐz͸u{䐄-VA`SXV9ịUh4BkM̨3hQ9b0XJBSEL99xqA-uqr\)J(ٌ+:d@RSN# %фLGJ{(D%](/0 E-fxTk06yBQlۥy6mEC&񾘇V֜Tit+":%qRﰐILH4 sIA%fw¥`0!0pZ`8Le)A@[/5^#Zƻj*M;8%mas/5x]8MEӦ71IN[U8Fp 惊&叧zqrrJmQ@4UV)4FƐ.>zT!$UHQ+NzUXs[)f8]],,E& J<DEbDͱg{!aAEfYv,CµoE^Ԏiv4s)@S14̰G@TxVL*`h1\KJ.t_+_  Yey\F 6'\ E"a(ae/ǖqA5B.gN3-QK%eg{ Ecӭu]_]oj<$ZlV 2nщ\QxnVI*eѝ^ڞYr;x?qmFiY N3VV!z#%: NlLƒOe2ix @9PI_|263%;"E CTD2'# *vJ!H&և) nˏt>~(; rf ZL^p vF:qH( <;*edwv-#8EP(M)c˃KƨQcS,;u.#mc(12X)yp(%^P GZ #u)[n#o}Ep T%=w Nz4IN%dSDb$ױY>JFƅo fRx>$Fg\?zxpV-ճ3R?tpC!=1nHs7{`DBFQL;~Uhg7^r7J^l]5A,҅PHn0tW̒ElPި!Re\Feqzgӗ_z~v?:})&O߿zW0RFԝIxw']F|}CKTi깩/fғEoTRőD: 2eHg3_?ڻ(={UM4hk@` usQ- %WnƇ _:v@]e:)Bޕ.>BKJs= t Vy=FS`9@)dJgzʑ_1ԍm&  `4xȒr',oQmɶ.)>r9:.?~UHdD%;qE6Y>߻u0q ] 8@ &.%C@d\X&">E)*i;Mvϫ5Ӊmgt?cä?w~ɾݴ薨uacZs`!;ꑦv}ρ[s%,9BFE1h)B qIAFOlA*2.Q*MyΚXͻ<xB3V4syk+]s^Yvsсp{`GW7U}h^Pd눆IӞ|3ȃƞ}Q"oJdFxy荧(is7(LA պW"@"&KٹZꙦ3p (-g==w .bW˄h,K1ER_)KTZ٢3@-1}QO&1^v))HRRbV䤸P[MUJ.Vgi UǔU=P09&f+>` Ly'-9IXXSʭUW (u.(}5q`сKxd"$th^j4_cZc}W] أž`[waO;zw܁IhgC8 &8WB gJG.(#t)Jm V>d]IE6Ԩj?ըZ)tsbхOEm_}8s&`Tt<On辺e~!Io$OPp_pI{+"V埩sjbll?Ӆ;o{ ۶_'tw5Gߦm͉Fzxց ї\yͮ~՘,Y:2khZʨs,J2.Ԛh͹.vpln%K` xE- . 
B|+iy{gТvΫۈ< 0+| n{/IuZܬα]~:/z>ZQ"{Vj"('ːFDB$mǩYLLI\&Ruh'Q 4'"’:dZ:¶8뚝$ty\W,g㹛8_~]`\㚬Q/7}Fv2탱ۮ]?U`GYX:k`gmd#*GWU\+WUJ \ι;&vE=VpUŵp,pEJWUJzzpe\ UWc*-UҚ"\ x9/,gf><竛ʿO_dRpW'A?\n&w h(|PwV %Poo=n wD0My<0]5X`J0MRg&aZ衂N0c4:xz2䤎fr<>}<4Jnr4~s~΢J# ΒZ^ռ8[tYj:cWXHw`zp)P2Y^,tydE Q`S._UNJu- *[lt)AL1EVNt)tQi:-Txq{2*`H?;a:[ޕbi/''JNLNz3yI`blGѮڣhwޣ]{ߠG[(]CZ[i8~??j-A!hˣrڙdv&peڭGkRknG~tȬF:F:QȲ4{d J˅SPxB)J) c< _6|l_IkZbEW1[yDٲ[.bSC1-Y mS +>\x݀gFgA#,QJN9ggvv(QQR:2&B"eqS$m#:Ue KL("De4E y I{ )D HQ1l&z9.K܎]\^ݒ, vc4d.-'gTUu"g$ƊD/廾3ueϗ! ?!3)YM+>*g_sZ{uťtnyP(pqN;+޴޵JWg4>>"PZ"Io ]J/l>sJcT*(($G ^+_BNU4)`.㧑չ{iٸvمNOބ9 pxhmQALD %ӊ'4I6#`KyFǛd` }>`mj=d> hcK播TT;ivך|᥼SʅDlqLgXP3#z**5iw/Y{H&(M+<焼HgT "wgMv0dz>3V6xܝz4Z۫Ǽ^ s5E=pPܨV޺WMdvIt-2`hy˭w%ȕBcܧ b |ᦚ!FHL[ , X)2c$ިsit;&V79z1\#EPrQn6+=w0D3=:/QЛM%!K~](d/7?oP1s499N:1Sh%[Zwx0&*EahSo_){9 hX@9 8j>Ghi~wz\KT^d!I?"< {, ۰n謏X4 8utVs@ڏrT>]+Ew2M8[R*!К*JHk)LVg)ۜ0HFfGv\6ӌm5BcNp4ʌDPN/nǛxOo`~0}\쿚پIxN6E|0Q5[P󤙠-mmkꍹ{>IB 6`t-dqEj £uk[g=b0͙AVڱ-jcc{2 D;NvIR!D%Z0^sWw HF鴫uaC)8,dbB)ȣYh8F>6qz'FmNCƮ b3EzDqOJ{ЙH@2*JI{Bσ&HBǶYj3h%+TZ3!t6dѤCo$Ŝ Wʪߊ/Ў@yIF)r)nwV 73DYARD)q{i-/1cx8EH"%yv*YP|*"bIOOT]8n";@^$7?ěv#O?ePǧ<;:~AI⮧O!##WJ* eѱռNKK`%57M٫MٰaަM ۾{p6/W[XjRߋUM-UYXh * U9RpR_=_^<:~CXʟy7/Ky36ym4JRV~ W:V'EY~6lf?53el=I҃zU<^bΫqkp߽y*j3C_f0 <[Fu}rb7/z`Sz#b1 #ɉJ\ihjϭ:jv:Rz2BWT B{Iu@H3ڼԳDYȎ.x"җ]$ qÚA^s-˻"*NTԠ>M: ਒wla4HFHҨpZ*p{ %GuDo'YxYK}W <$D˟L `vREOKya#ST&Xlh4ɹҷ$b::wvxԵl57<'Mq/;zEcaᮢj.F؆eW$ʓKSM)r3pOseSõtO,2ÆxX~].xﳣ2s3PD^ |qre׸ s"D@l5 Mr>5<#"dIZL~)m=]%4ujZM/n \&u܂j-+^Gdj}OـA֬a;(gKg( \3/,,5;]01qHw{g%܌&{0HHŭT#-2?,!(o>`@ {EJLm,l.~~ A[QUsgfLq:}i͆g٬?bh0)r0˭VųŒ~+&zO?z)lCog bi~W1it#4??ތ&OBk]&/.T<|\&7z荎_?yTy_̳&ާ^r,P\Yѧ`m2T㸴iT+t:(6Y7ʴ\MT}weFOaqm b ^~j펳L0~׺=T&hnͶ h9\ӝuuf[ǢZ&udi 34]xtku_& 0t 5/Į2 xv.]kوIIb̓(9GK"%2A,w1=HAsbovl;/x;C xSx_Pkc&RJ,#֖Z荔T :ed2K?ɤ%:_X on-1,T;P EcRɜ3,* JVt`R9Ay+[M鱲Y$,p[_e ZGqDa0 !QR$5ǖ#Q ˥rzv-!EJxσ,FG)"N8E8#mI8?} ?s!c{ +ʳgvB$9Y~&Yy548Icm`ָR"bd\A _\*4gdKdgl+Pq aIA)nE˽bLb/u^5@0XY1gP% D|IVT3p% NKwYy9 &e`,Nfy94隼R!\f#ϱ=l`r1_|@K)Ğۨ xwvyjJi&?S+SumIveNǔ4׽JOEK`4x̅~-s.` ^M#==7raph+Fu$Wt6 
iOhq`0,yT&$AAt1wћ~fmU#`b6j|CC#i`إSa̒E|lX9 RU N_z8s9,޿<>=zQ}/??}rt|:}O`00 6VԝAxw']peTХs3_UJOWӏOYNٛcH"Y9j$ $2o!DJ.CVXhB]0|2`\"r˸, .|B: hDtk@7NQ%ɹcB EЄ VJEƞG# 1.#UJ(R{Pp$? 5l皨= sJL.ba}[AX"R#8RBrr<⻃Nv*dgkb|nxfO:y3v>L_LJ̇uyZ$Xv=zpexWE߇*TkxWb8 ԾoS6 .E߷/ ZAnPr> ޮTSr2d'RŕSrFz.ތ+2o%Z 60[emӅ;l毧 Bp!ͭq,#&H$Z$&! ] n0woirA{BBPNci|".ʍ"6b®FclVJHH!] Ee.lډױP>=HAhz܉>_I 5ٯ*3B Ͻ A;D8#͑_,r+XaQr%Ybm|+[ӽ?`^Bd8"%#-5KtȈN54Fa ""뻍O rH# be}0z)#"bFQ4`h[ʘtbgl36 w ;2d7J5Ujr>|/IXSbL Ix߷z8)!xW L%U"vs4Ob>vדwK!!1H&R"J *#FA{h<Ю"I hX'jƆ+P(I!F RѶɶYHPEw46Y,G94S Nq1ʹJ&bI3HoIrَwsO(gl9!NJ 0Cʈ`4K@R ddld+%{x-σ>N kc'; $Rr] Tb6Tcms;rVF_IRvadkdw w:i&$e\ݖao{zEx^C.a^B, Kg\B26ի!W(^_^ݼ| ?R@d#v_"&~\Jb40QKmZ.=WM _ޥJPŶӴG*VkZWkZvyѷX*0AT|6z-YivQAŘ$9\fڵ卞nDKNN!\!T ([-"x[$g7$PLtSwz]ʱwRtׯխKgizAFFk!ZHK&chp9r)qfRK&B| Eęr )Iؽ#M9KǒNzGri&`C5Fh]Zt3>#jL ®et#ٮL("UDrdVJzZEQ*~i3h7pҥC+uB*9tp=•=L{W\7pWJW Cʒ|Ʌ)SrYg"=.^.;/,6}a ׂcQ"^)b4L|hQR|ɢpL2vȄbZkWo>o.)٧`?Fp&W}LyF*eL0 B.'toڋ72ߟx5AM7.+::AwH蓩 C9]=Ƽ@/P oW(E;'XU],rDUc1ػjG[5?Q/Z.[qXc/N<+Vxay Rlj;W&spJ^kfE4 % »K^G 璟R KN^/q:6#îp~M*.1R <St+>Vh6g_"hai)<1CEpy`N-Ws쑉 13joΙZIĮؙJޙߣs-&ӏ~xkfF524KroFJŬS׼bQi~P<{?yB>:.kn h墼6=.W"mmm\U 4>9oä@+n+nvib~RܘWp˞1l3%-[$"ܜ*s|7⢃M▖X%ju.A 0q9LV?N'@ "hFB,h?_ *SX_/oIڮ 5YkKjI;Nzf[5CMMQGZgraoN3juLѝdt5;cսT?2\ݏ\pu/jyrj'JDWzJmxZU:%IqaG̖X|NQeL5|T!1NN? 
@Kb,' ~G,@sdK%yl;S-bcWf*UҜ?yAyrKg$aT(ωpc:HlZrUns4w1܃7VKh"귃98a`ʖrdgoËY]Jy/rZ7E͆?9l;q S41гuoXr-`=mCs%!yRYn2'|kgJF J %&2` ܙ!LD$4jT*B뭝\o79 yk1Ҽ&}#قཾL=̆FҾ+ǶZn ׿+\_3\pd{tk85ǩZEw85Siw= xua_Mq #x/CԧK#+3)Ă33H;q\uCNXRIRLC!A-GpJ:(}˂95f3 lUb" g>rMn;^8V=N[tŰ%Z捺,ȓ8pK'ׇV-x oOQyu eIJ& Rhjkʓ NJkTHlTr94\RBrd028e=k` 3%*I%&5rV#c{JkXgle,T>()s\f͸ȴrY]{u'qßv(m?'O5NbPC9|4h0^EPUӜ)좥5&2"0W*vOf5Glw6T^3A%6`$jYx>mՈ톃db jiMڝ{hrЁ L-)Axo6JH oygӠFxDarZP@ 5C hE{c$^ȴ8"Fօ$BvQY2Fj5g#: Zd2jm LҠmd) QB:ixY蠧z:yjØ$MRI(jƊD(2:0QˑR FK;?lgU68Ev) X8L-4.()nH+!nU&_>~g_ME/'sb<5WrEuP]tljѧJŕ5 2&ihS]Xc؇Cqv |; 07j#}C-SV6r#I+za|q# 4h_adPpR߼}˛>dhz?1o2N뼶/kKn"g';"wrRxWC{qRТx|OvRzxU\vm|ϦYX<'ך~wo߾޼TӦ{0{Kѝ o]ZfGV_^ZzVJ0 D) w9&gI&h/:գ*@uO^odv]# %[deRϭ&;"[2,uCd(Qqa|*H# I)o ʳWE^'U길d.< sX؀OcpKW=ߎjU Pe:;x3{eReX|{I1$UEHjBI؛?'Ǣ7<Ǔ~4xqeкY5>5.uhڦc_ k_q$yA3j%1HK%[p37QM)][6ZS.ޣum>u59WiOPʷ8VLtW)&):+i^Fs`MubmYV ~8l7>yo=_Fn.&[BŻW?]IjdUMO:^8U!‡$uIF}aa0?~ o4Σ}&ݵ^>C,=,U͞i՝ZIWƣΟrTD`|<5j.LyeNƈ;o?|$y܉ĘuݛiF _B0kͥ x[UsKk-0pPM&pGiyQ;`7"99D]>@pa.4L,&8FG ٳaF\Ο;PΥޞ;&덶乘pOt{=uW>Z@u{9if&6x3UT3Uqw jLue;m mXJN}-Au5+gS+[Lt j)HTBk%-% R,%D9N(ː #fihvb侺56\ L_sic`H+%LJHU33 B>w_E4ʤm@8NjW- }cwOSn3ʞOxGPVb |{7Oo^7z3^c8%bV&[j(8`а|=s: l0џSDȑpp|'! dy^!)& ,%CP$Kb᪠༷T Z:Q"uH8-wYDN{ZƇUgG?z/0z]i_:+ahDe얬Ux =ƪUve D{*j࿐*)M9"KN]T[M"fmk~mԼxiO\6}rF\^sxl7򽪅!n2>gx}x$ z-jAx>Z!s0Hؗ%28:2yU`)U: gf85c2(pP@\#l+qg011x}ddZ)*z^q2=!L3T̲dEe%y(A[gΞP\98#+!i=<$b^; >IoY俭wFSkl {Q@-#0[Z5 O@ȅFP.rDpA+)V$Q>piPH ` MƵڌ\$2VYxV֝\pYYM% p8\e8T$,e8\5AS8M0.F3s^SDj!IY#@g3BsVQ\)YTH&1*FbdA'~LCoNOϿ{Z\^ML-L`\¸POCys<F?ϳ fwcKs=%wUK]'}gLf(]g )LrR+)gn4[sw6$x{75"] d_vեQYx f""s)U吝}ER:WN$u+a˾h>3,`} ;E jٳ, l_hX3&׊+\B`x1Re}gԝ*y63\suMyd|U_x5>e*sMj[ngm;g(®a2._]޴qoB$ז$n#?|gy1B_Q؂J5 =ٽ7=kKedsC+uͳZ:Ќ\mr֓_-+汜a2U'~2۝\awǻ^~͛ۗݛ^Pf.^}훗[` ! 
yp~~PC࣢k^N]nN[}/{eo-2ʫ(\/vb9jw ukD Mp# Am;toŅL B b:w]̅89C%zNėN^FI bkHȄ5$oQ&bD*ebG#hNG84KQ?HJLPC%g[axރsjB| &x4D橌Fc1#Ͼ;7졳7 \uaz&N` 8իYPHu݆BqvHuBYx0[HМTkN!~8ď  d Vzx,x>DςC-e,l*N:X'Q?p!LK7F7T*Ӿ0~MdAtaxBGOh~xtsɫ Q҉`UpyV {DD ~lH @H$)˿ ⮈tTYƩY^IVIး}@\vx[aAC8pְ' -')I.[mp/ϥ\K* !TfN8Rt+ .&DO>GZ崳!muYN\'QurvF ]g-:k56n{rmbvsYrF}@|dQ { GIrkUn~~>M|'/?9T8\y N]^ϛ֡۶Gf\흥g_wՑ46@Fp%5JzkTl{üB9ӴQk3(^_fSJEj~|0ʅFƞhQ,_K*sSDV7ʣygh@s3)+CZ'?Lכ}Jw2QuX_zQm03P ,ӥBKrPLX44sʼL?X&#+W(%Ŋ@A! UVkҗU&!+ν!l8ZO|"Xc&hPkǍ3~}"i9:݋MS<ʕ`1mf IeM9]o5N6w{8_JH޾N&7*%cGIR3TdF FU"-0]L;;Wu4rE"hMrE6;Urz=bLX:VOTjխ\jGhAȕn!Wzc1;\G2\-gQ 1UkN/WxpEV(-䪇r%@֛G#W+y,rEJ.WD`J6G$WlF7hUQlg+5JBU̮f:ŻrV5"w}qG?O8]o u[|,R sS-*He\,2-Mf߽Jtۍ;6qڑs\XLq2#&s-/],}ކb+mTjR'K8o)t&o_MvguX:l]tTW+7Ҫ RJe\ܥW7~uWu%e3&J @2ycx,d_y^_fS٭L=ޯD,rEGu[|Ǫ5 5GC4L꫿Z%QN#zA7yеTx-:(&U o$kj̭͚nM{ɔ]Dgj8Gz=&pZtZ9ՁQwN:W_-><,m#;_yօQ[!Z tٌz[G:c_^z|j8⸊i  D' h =:IvLctRe gѓgO?Hґ!I=ϸSy6rPb(8g/?]^K4 < V(Ly~d*ݵNiL˪Qa< c^Xwn+/^MA2-Qp;tl{]Pݏ5߮Sk]}w8k>L=u6'G4^It#;;e{{|ٱ ;Ny;lpZ'iputTCܶm9 rsdehkhv ,љs{ցCRf^]g.wvq-dj]-ZE]CW. 6&r^rZQ( ijPpE4ʉVCD8a'jw 3cƊNf}'m(7 bi,FlF΂"JgLKiUopLR2=Yd2}Fv |LW p\W j$COl꿚'& lJ4ͥ~xuJ \#Ӕ๦:Xlb#_̵̀`%\O RAu0ƿmL`_bm/Ԫ1# QT!JTa"e]n6z;"6SC3=m=(zJlPOQ{CxJ>(wJ%K`C%|*ӣ;˵Mtqg2d*و AA4rE]lGkurEx W+! &"WčF(CK;ճȕ[\!""\b+ǮR^ʕFHXhpE7.WD W=+ʻB`M4rE2"Z Qa0G2`D? n*ȄB9jb+.EL*MmkM,*MΆHT*m`"+L(䪇rQMGE3!_AbV1~0u~%nǫc[jƞzi\,,-rulՃFCDrhp>Q*1Uk%HX3\ng=5Z.WH9U?JhkHplEkB+vʕ4\!s,=\.&vE* Q`g+e1 ؈hpE6(䪇r͝Ms`qD] ^R rG2`Y#vb&VƠFsjj^~v%*&eTP =2wUuFV:#JЪewj9\-*YAIXL7u($5фZ JZViΏ5i+!Z\5\PƴMZ\! KQaKˑ+w`ջs[ ;v=heG:Q+B WV=8:+"B`ae4rY$WDmrEq9ճeu2"2a+•6"ڮv4zʕdHX3$\`h]䪇r:HXhp5E֨(C\G43B`ĮWA,rE&P;Q!vG2Qg0H"WD]\RnX:)tR\A@P/±&퀍F2J-w4Q6T?*pS*5""Z)% rb䊳êL/o ٵujGyX!;յ\S%0\2j\UQ:;UscHuN%j\q\Pac+F#W"WDVɖz(WRH$O+E#WE,rE*Q>UJ tLv࢑+фډV]a\PT$WAD#W++5:tBJ摒rE8{bcFU xǗ[J#nҮ`qd_Y@ZƊHR*ц %B Z3\!0x"kX,rEo ](W xFV+$Wp`È)+?%zj;\Vw%̻r\[!tRg \mW[QZ>UkɄH]!.0\޻"J\ ;\ mgѪ+\Qab+]'D)+ɍ˙g8䮂ʭ™,^JV\c˪4eNXƴmP ٲ*LWV@Ά@Ni`,!K&+3΁5UK*r JAiLJX=Sw:b9]=3=-#FpEj{roHTFi(*.&qU16ZSFi -qQYܿOd< l<@ . 
]dm{?S+-}–xDU̪1>뒼>TmZms}vy_ogV[_hkԃ_{= /)> -}ln1N~&7X.ؠanxw [n+b@|v2& i4dVZI#[|:,ٺ~euw["|,֯N >MHrUCޛÛՏzgF:l^q+N{`Ч63N{tŵ~ T=S-~-yxFJ}>t\lR*A4sJIrVpٯڒHȆO8OL' )tb ;y`6>gkl\w5ltWjm0_V(.8yۢ4'#z0z /!F%Zε6P$ɢN&ci 5Z:5=/rw6DZqmO-eG"MfyV(cr-(Beˢ]e5]mU lv۰+{>`ip5!$|0jDm4^lPI:0*I;/NTZ3x'|hക{9(V~9mxC YHb%`0B..5l &pf.^$cF$ۿYϲD#R" ƥR#A8"DɫRH2]%8j)@Х$dixp6gMR&0E^ i~ gkϥť~?<3H'ߓ6zzg8>7*,8F:IzU&cmTar MLz&œА[o~߻LVs T[:1˓K~rl8dqd2C͟-#hY@I2_fr7毴 a"N2¾_F1/FgGڻ kN7oNz}7%^`rjEC.45_ycd9R}omNֽY_sϳ'?6?x n\ g糵UzmW EՆ?]Ճ|xϚm#jmI#]lFnlfyGq8j(ZU\ٿOwMnvQ%=bF2Z}،:t5ԁ mMe _WxD,7cYN-:g{yNO݇~>}x?o?;:b"po~>kpk}Ӗ70ЍOyJta?-:nrR 3d{y2"ͻغ UX,z".qks[qI"7G' q7 .\MAaUة# zud-EI>$p%gL4O iB %D6,/+T)ރȤDe8]X>ak|1A2^%c :.iLj\8}|OR%U !z9vZqLpzuy,!c Fvt ܀' +~ ,!1cΦ&օi Ur{P/5}q|1Îs`ȁ!ö_e{%D{fV~0>qhă`U22E-E!.9TRaFf:}*$4H\Gt*fX8B)}l'F~t@:"į[3nFM 5lڜ>UMҤ ^}\|ߟTOKB?%jɚ=3u\I~[IBJ7]c '6Ο/J; Bm+*+;tD&>:_N0к"փ9oeg@`^1 <Q2^gG:HtH džD O93`ĿOIp܀0If*x(!g SrHCKH*f^(E [tZEDr{YU-CnB )8μ IXZGX-GO6 <BVn \@:')f|b4dC r;:@?t1y5+tq;2@ }wBߝlMw{;>BԢp"\'0eETº,C-C5%aL,[ER˨ZӦh,J.(-mD-B3'mZFs|!c63ܧ9LvQja۴Bf\Mgm2ﱡmAp4tRV ,&e9hS1אIh7"nIo4%b†PiLD$}A TJb87hZE-=[zП#0XCh[tFܡWti^*lujwx ܴ:DA4.FgwdIɀ #/":W@Dd1)u (wX1ca;X;E܉sW =e!V(H 7xg5v<"x<] A0Nˏ&aQc{*օ<Ӌ%qQl6ĩ(E+oK2bC"DF(kif!)g1E{IIkk(՘vykg%;>/"\cSV>^ig8zvLYviۥ>Hot']?^d4)5|)̛uIv?ڳXSP{ЅVX41"C)/cp{c={gA;Sz~@ 2)!*k A!ӥ#-5*Tɝ?\2ӪJi/'\o!>gkϏ51_UzOVڵprڼY:U?}6d62H8.ϼUI^c!8U ȓrw@ЧJ5Gkլ^`t\xej'KQ]9)K+JL: BJWڎVc0֘d_u5Hz494c5:i6_!;#RZ7Ê3%% R@0&޸[H`u+(y"R=r3Tk/tc P ъlL#(fJu=w&$89"U)!'=GMzfBDaxFͮVo.;:ǣbj1(nM&,`KjNj}ՇKP욦v7ޅgUU쳩͠G^UR+WթT*UyUu}Հbgn$!8zh5! 
s&jpɡw&H xdUMs\B5K&$K1:D@H}t~v9AAhBb+B,Dh]eu5r䞝/&tqm):BS2˅a<,s.E&_'S`ؽl>*;{Kû]N>J%?;ȝMF#6uv%Tmw׵׏~rBdTtԚQf5=-pe6A?^ͦCjɨ%ûݵznr@+-wz!Owg\=Oh׈{=3azܚYVܚ/Z{)ozǴ]vqth[:SbDyn.OP^!TF4bf- /7\'Sb9̴ JEWyt_SGr8/:łEb]Jpo9Gbґ> Vbo9HV/~5DQ&sڃ)*1$t bs`M9نvmk %(!)^ѿ2#1(ﵰwtܧ5r3*U',M=sk,wm ec!@Hbm pOI1Er9%yP$>D5%bO骚SUd, Bj%Wh:` m݆f7w@>ԓQy=N#%i"C+lSIհKj sخ)(GkQJ%`ME @Op)2hK#32x5)DȘM gUaa-X({,).͙%x 9:tP-AiI5[FQ; rz`,j M0Z{lB$XQ-eDym( V 3 j-lR!`V0/2}RGt`Z*w\Gl;9soP;uf=j0:Ly9Lj14sZ)-0 ! qs/X$t'+F.`1Y1 4CvLDN@8Fc}>e?SAkiH/m~lEĶ{D\2Mx GE(^PyfQ dXa&D`tިV0BB2g($^2XobA(iP΁%MFbR̈M r ?ɓDjG0pd^dE@]V8$XLA;8g_a[oJ%[:Oq<ڭYYz{zݟTJ4),Pmr4JYj$؂qN:d KG B4{&,FZ)!+̵Sy΂XN%X#"VX,qz# 2 ѧ͓phzk.dNS; ~t[R0aXM`D)XCB~֧*R)+%ac'қȥb5G<3L}=KG/ yLFXE!ym(@Bpssيgd[v 0!)2D TAJe%՞*!Ʈ7N+}xCpߊrXݡ]?O| y8icOˆ .,y7(͙?&yҠM}aaMynwoO}NξKZj^^MK™eJ R+2Tdehv^ j9M<響x݋??sqdppo/m.L4HJ'':<h6Ao)w.o'3-6z`]Ãըq!u4I5hrJW}o^40GQՃ^g׃'=/^H Ν1zΘƒDr. ȴHJ4)skګC&gzgb]oMK u#,{Į˲ pjIjR~0tXs#ke(`HE c0JdyURQ;lja|d紕0]nAgWќ Aq6Lf20F㢌hpxos[OfEͪ3!|8$t\ty\9P|nIҰ >6x?8Bq2t4y3W+ֵct\7  MWmAmк {mC6[$M/PIgVjBicsE1}"CΔ)E#Tʣ(K}=WQ-ik4~>zX Tn1 vn?\`۴t&mFlr/?gմ*٬?bh0)J0+Vj"HKLFӑ fO?YupjƿUӴz7?ڥڰ?M~<ل/w]Ѩt~\ON6ܴ(/q~NǓW5%iwx2wӜ gli0kNfZfSuc\(*%\[ԹY>aWE{:ëB˨lv)qsb㗣u{j(UTDh_ eϊ@m<ޱ);w&b vM 0|jzmc.jmTva8YgoxQ5Ӥ-VL9߰CōN^͜ S{UmJU<]\Գ9+=9Ĩ'Qr 3g gS "8Tˉ ls3:8j52)%3 tl/%͍V<^Xy < ?OpO8eOP=I5/? Z?$C懺Dr/ EIɢ56CFvAywU{Q(yjG U t,^h!ɼD)2Z+25t4D{R%,z̝vX-"ro0CJf/h̝?-8DոOL-tOOǝ6}W>&M U.V,~h͡JS+ JLޓ*lWZge*hO[sqq%ㄔ62VVEݨ|"]i{WU"[KiAf1{㲝>)Os3V+k٣1цۖ[9B/*ZYwvKAJ]s% 7^+k)˘buʣ&p TQ lh:vFz$yygm l7NJM{2a4F\L2! fOv$ 옷&>=PFC >{?j>zG~2ջCq~g>9\W#Xl$v$J\Ln'-vĨO DAt:JRZg ̶;VF4ǭ j8ha'IX[mE}yfÂjy6,;54:(^5h:?hϼSw7Twh2ߧgaՀܠՋ!M&19f I8M.B&O˓[?zXL4=zg|eflozG:}촽_4}"qvQ³F9RP[ݠT2<E<X2tie$THbaJxscql1\X_3q^l7*>a1v"IA>Ԅ'Ex5 r}pB}l~~>0̒/θtyIw9i+CORDt\817{8s/Ā[]}?v5fi]êI!ǐ z\̔48U%ARk=mW,6,6w,6G,6,6iUM`dtKDa$aE9em4y!!ټU@)Pʍ"VF@"YT[R.k%#R$b]@l]. > ŠhL;m OcOAAAc_-*yO*&W=W7{Y>g=g쳞}ֳzY>{&"93ke5:Pr.`9VN'A?cgmI %ȗZrV:Afxo||f.N. "y*w1F:Րđ Y2^ wE%K܇[r.ͳ/WA<;6IJ{T*M؝JP^{4?6*"؉٥BB,ZZJL)0">}JF{s_νpaSq6,Q^HhwH)6`|,! 
$HLsvf|yyًƧw?r1B[4DC6M_ W(u B(pZ̴x[ ʴߦ~Ͱ$'|KUv΂`\}7i@m?$R%`e{r>`+ X&H`AvEk~G3p1J5/Uzi.;{.DQ ؋$F^8a F빧10\YT&8R(wʑuYY AHRV~u0k5f,`2b=6MM,-llHk'{dn[b8/쮷2zfʌM{jKυtH_SxV\g2emvM.yףӞ „\)ZRHOt$m}TĀ.6uWՙe:c).+e׎|Jspzkz>LFugY7A^?g[v%}EvS]՝/V漾BЦ;{ƿ߇}] :1ur]J)!(y0$dŕb] %Ǹw%~D¥;v%L3f|@sxֹ{K&- $VG/"VE0n%K~#t&@L#1Daa % T[,heTFg{cnާ WR[[]p"x`:pnAj2Qx- yJy  I X O>RQH$*Ű[)rj6q/p̨rVt\vC l#KVb+ӑ4!6)ɤCj؏%UXJؚ 0Rh_763tGEݍwטl2Y$c!d~֣JB' ݷKV:,ǩFF7Qt#WaCDTh/}_8͜qWk^*|y 6J_ޫ{} ݆l}{UxS7%ŋr<`E|t{V0]x@,8ywlM7Ѿ~0 љSNZLcRNj?\\ܡHRw![ jcVꌘb'|/_X;p}Y{ֿ9  (z?x t/ͥ(VVܨQvKq:Z-&x!;CAAOrvT{re&E76\RksҼӓz4شXimp/}oTѻDq3oM|Ξԑ^Ft 4FҪG0F0BڃIΔCT1#g$\ JLh1 эùh#Hh_uV5$$Wg2=6`z<5xqY=w q{>)UhDU𥍑xH\#91~gR)9Gx! wl2=+gEl"։V)ːh`+%s-!Z)Krrf!.=|\4jTn\E׊>8 3Dmm3|NfZ<茕WpW#9C6 nɱgk;ŖTHHICbʒ28a&V!0P;<1%CPzzl݅$ehXēO͋E|"Ӹ0.(sb "Ĭ\9‹L,mXڲBo/Bn٥U;3u0ee~(fR4"<,:{4;h ǁ|8RpnD_ ƦML%ţɫ2ı6bknM"u*̘CX8)3Q~/@1'i@;`k 2BwZber@&t=h bLli쌃n4\F{n`B]6BnU{J]jC0`Ɛ)m1߆03 2<,:TA6(hɖ)pVzцGP܁m@یLt#;ӡ%᫈`:9m;d!.'1O_6ehߟ{"tS*6"`UDЈ !߁]Z.5wA/hC$*Ò#tsM@F# P P}Q{p,*ȭd.@br@HA[oBIp`KƒT @@kNkyT < 9t.WAV#\ecm\g*H̴Bֈс%P?د D;M"\[pi* Ka|p-)bl+'$7Oָ&!OtX#S[ٕP4@IQmGRk4lЩf:pfcXPLSn%lg {/4,A  F벱~1{nzt es?iIܨʤ,= D=0x/<8z1^zfMhޚ_o ]Ԁ4^[f-tlݐq^,R( n d̆ <>.hmҌ<&ÁvCPB^"(ڐ͡@ndDho6?%^z m %t 9Y:&G`#z[ P}sh%' [WHԺյtrwp=. 9B|?u7Հ C '*Ug(SY$ۓv Q*FY#Tu@ 4|SjcFlzss^2S55nML^u]L:Bd2JMpUS pkaܲ[cPiRv:`jщB@P}4 G:8:@i S5],ol9(zt^lOue;ƇJd;U{:o>A, &n\n=/0yX)8gK+U<&"I ;VÀY6\EP$2tAbI פ Z`4r>tUIgW<<Z D9NUq|QꥅNu-n===[R JvM5:1\!.uڭa?~־C} C#@p PL_okuE<^V+>d>dCxTTs}oA'}AA#Zg>ߟruV6| ۟s5\\u-IZ8l=`1,.ج~8:zyEhx2~-Z^{y)m0w72*FP:XBY%B/dC\=f={?EtǟI]:Y^Q] uۣGhώdwFwKߗEogLq>N)ڤr-`x2Nb|1z? 
[A 7>~`X(\ix%$Fzj6w=E@ jMv|s`p"0L3n`8fo-]L^ov7mc-Sf/pK|>'߯/w&|m*8+Û[JQ[P΍Es0|j}ӟV_}u]~qtUh~d׃˿fᗀ_^ ϣ"+|C/0V7QWhIŭ-^ڽ/ŝxR|sy66۰Ya 6K6l^a7-vݨl/>}g >?> mGgt ?O^K.n,o\>j\9^]^ʮV˕?}\=ǫNW|7yz|Zo_|OOv!4'*b@[-֗A<~s:wߙÚCh(B΅N0yzN $%j$Qɀmq=5=]Uu#\GW{ wy׹l|e*N?fS΍Oi9z:ݦfd0u׫wC_6V+u/Ylt~Ӎwsvooyom>5Aמb(G bXUN#ʋAFMu>4:qPIvUewVټzІqUYsٴۣ}d:lMEiZ7CO?tdŬ?xE.XDEC)kJExW &0;> o(w_O_I޳ߧ>lGn?&ĽG웃f_jTxq8,uӯ)L~7W_Ux2/ۜ·Oᇿ8m1] 3u3oۻ ׽4>3 G/Zun[|p3hBl]^fN>a3ܸKsS.|*e >4w7] {oJ6Y]9rW eG!N3YMqddƋu8ȕ}/@޽f``6^>+;X#Q.;47y=t Gߚ`4|f)ӊjZmfA6=fc\OS*fWnhhnXunf!ph#R]#AMwβ{q~aNHAIRhZJIX%D)rΤK^`9Y$aDx we4ȖZgz O {f"źc[|#LL1b ,ykx>D$\-qp%Fʋb&}!mmnH2ldڅ{V3#Z{Q̰IHJDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDoVb̊Uba1%]l@XsN)LXE "/̒wIT9;R]F wH I*~">%_{K" D !1CMmXVAlÉl=ښ8xT>A-'W) ryX.s!Q8^W1|f=SA1}-{UQ"~ i+ž=M\c^p|'I3K;[=MJn>VJY2E8HRU)n83{+싓>]c >xɃk4izp\ݶƢ{(4Ԍ$\scw亡\ r\G:"uD#r\G:"uD#r\G:"uD#r\G:"uD#r\G:"uD#r\G:"ȴ\oukLxՅ<. _F$pa0o&潓|' {GYVhԊ;!u9TWe QJTi^ 1`ENDp YM%"iyޚ8W=3;B| \`ǃQgr巨˻ϗf]5UeSe -̑f|qJ8TF{ ~"̹` mD[TyTo*>)PLnﬥFe";*g4 GҰ.JnRNP,ϡ,.>Ҽ<Y.X;Qj%Fr:*ƃNU@jOpjx!׊w㺓\"`BG.I]un&Hgi Lj.a/7 ~=jK͓vj5ypeM{>Cɳ.<%z^+05T(1kr0%z@ZDHy %z-ial||/Ӄ(,,!A)SFk<=8Ӄb!}z,o*K>^"-僁ɗ*J/}iN5>>wz0[M12n{N6GXM.o#N[&$3f,Ϊ&uOʺOdeT.ErG%a%5+-q>HX #'$D*ZJRguvVWz0G ap`F3s^&*Zr4mTviUP; .(Q1#*8Yr<:Ϥ:>/ꢎ?fXvus0s$"M:*F4JtR6j§ܨzcο/WO +{ bxxQSq8¢=Ŝ1dP&B9aGCrt3)GnGaVpik(KB|nJU3GΝgjZw4~T?W.\6.gLѹ0Ws_r;ۅ Lד_ BV݉wrV݆\@(bLAFQ;߻x~Uw]yWFuY-X: i#Uq>Bz WWv5yȃu,wjؚN̉_κr't0ο9oޟ㻏ߞ9e߾]`&)pRd~~)cwÊ[/yn]yY0[YNW'rɔ]?W@r{Wu;p#6 Am[ܗ6y}vb!v3V%TJ6޴GkMrc{Ο#p9pY/iԝGv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dBv-dzZ`[[qHE|L,n?"YZ+/X0,"MdT=A!0l(-T:,3'\P4M^#dWM- ᮭLe-k.l[HV>.7zRM'?ٲZ%\HRJ r Vzx,xϵn#N\H2m[GXZCLk"b5q6#/H{\3hcau vylϋ\WIm)^8Skta\O珝']u\7D`.Flg@*Xi7GINZ'8iI#Nq҈F4'8iI#Nq҈F4'8iI#Nq҈F4'8iI#Nq҈F4'8iI#Nq҈F4'8iI#Nq҈~;8釩VnDJ:_ϩ Å&xRy ,'VCuĵ``ݜJ R2T)B ׃]' QK'U3; j/,b///&j''}QVrVẎ ؒLQMr],wY-b[g3*OXn{#}b 3s8Y51 #[o7AlR.!j͕ 7}m?}k?`iM4 ԅF]QjisXfhN_zmG m6ۥH#{-Yڷ-䭭ιSzS#` fsM ,\,wO|dZm[[gX #B>7|OڽQF]=j:(Ų[']O.6͞ҦOmz6jp[C{I 
eN>Z!}K|"Ҙ2'$ѪkQ[g/z< &b]k_~*sD:tvxuk=Wd nqVJorOQ9k ,_ɳZںg"xUv}PTM%b-+]ݾ\]Y>~]~뿳1]HDܦrZ }UuGQQʉ9]C 6 ɶlHN-An=2&hq  KGaUY"vGo7KrXtY)xLҕNY.K'2P6\u%Ҵ+mhQWZg˰C5e4%$9\"*2ZddoFN1oQRc)MZ/)w֫jb~@'󖉐L 2ueIdȇRJx, ӎG|ƑEkXzp"(oiSڛ6Np*k+0J@i-6vE3p§fEemD3Xےs_gUwY 1)⟻=bV%wnuњp҉4٪?^ڮآ=/jfoZefUϞFWݾ?gs&h]r]}˙3/Sg^x:k7ߙ@e3$ţNaU{_wKSp_| Z5$R˧,+ray㺓;g; Ӗ6)?YF, Z0!wapEK,-#[ZłC.ZŰF ^HX7)BsVXb S*Ayepy{cܝdi?kXx \4R&hE@ɜ4':x`I$1O+Qn;\͖ rryV2Z:MMz* 5 5RZ}ٜSRdAOԢ&f%Y]I1Y;lʲ{qRm[ \(A>#z]W[w}UpX(%=AY\FpP}/\e6 pLNT9ӻs=n6S6vY|NGξe3k7гD ($:H5~\r. &;+ڗ0TcN}J?fʿJޱib#lzBVԛDi5Clb4,+Ւ&(;=YWuz5F) (`/ B/]EQv$++uҬcti q\tƃuNAu%HW쀍AQ t"J+eIWc+6+T18D]iUtZi6"\h"`+8`+p+5!u]˺0A++"u]]Cԕ]%9ZLlZԈd;bdiXpDk4Q:-=@K{/$pz@ Qul0mH)Un6H]/%7 ]!6͈5.(̺ltz=^T8(\+{UiFM Еκziփw8 FWk-] )k f]DW2S FWDWDk!u]ej3++ֆ EWD .u]YWԕNc+6|tEpz2@ueBpScF^KW+huA42jrB벫 >]턫4uE.PS-ڴ=[|>9?"#lϢ\M+#1Ԁ4 dwO7*^fSb:{WL K.vپzaF= l,w\z A|Dk!,tEj Y"Ae]}62[fi8,Qz~oG׌8J +uҬ0IFB`FWkhK]WD\u%+W]Q(̺#]plthsGڊYWѕJ`+}FW\tE>+ :j2 m銀d+ D"JPYWԕQcqU۳ډRu夔&9ˌcFBNjh >5+" ;io+hh_7CԴWʰjgGrZ(e~#:D]Dk\FT7++}һ(;Ke]qו2D8. 
Yg=5c(AVaYW/zpKHWl]."ZSQzu5@] 銀p i]WDrt5D])1p"ާvGJhL]WD鲮+AHW ZEWDDB x,+W}󺓮j'J+tEػ}_"FWHk%+LmլLh~F`RLX &XOonsIo)BzvD `tru:Y?l{dy엘lQ XSxm.+do\||jɫl#\UUIQUJ)Z\ \X>NF__˓^,RTロͦx1aScŮ-t mWOۖ<:۲Ƿ% W[jMM ]] +g5̾<>zP$Km.C^W ΐpHkޫE2j W(`h;:6H;ቲu5]` tF@J͈,5!u]!#>#]-MD!]kzUiDVOLW.BW.Y/j:: i )CjH+%rHWl]\tu:+-W+}8\Ϧ16(g] PWXk6"ܾ>u]}WCԕ ^0unZuE&GWCԕ4+}8ܾw^" QBj]%6ς21Qx=֧&?onVN`O/Dhq *{pw4QW tX5 ]AgK+"u]YWY'Bg/]׳p]+ц2ވ]f=H0!p+5:u]˺ p2wG+RˬJ)銀b+{4 -V.H]WDtu6>0{aJ+l.BZ)lBJج(/c+v7BAԹjZjG`$]!{8ZRQ42@{ ;5OHV R(*1AK\l+fV2DDDr~]*֔"fB`>"Wj>"Eeəh㶨\"opn?^rG.iw\>HDbZH'4`g+ EWD &u]TYW–Y&HnQ{8\ݳhMO(mbBBK rltEsuE@ͺ:qsX)FW-](ʺ !xFRVzZV\tEƧ+9 $#]ltEMz29YWѕHW]n`wEC@+ ɺuX >o hK]WDiDu:H)8F:۞ni 78.FW_CpwM̯i$ƕtNy@FWDA:bf􊁀-]cFh}"`>]aqWb"Sfub"q{U$mhiUb{]ut4]nIEuEd] PW҃HWl]\t")e] PWka8EW ²J6A:u]UYWԕ7<۫X6"\hmH]WDx/d+{d$f]q(s ueHWlDWkDWD!u]eo+':2 P0'FǤ[ķvKLp,~nt쁼r>JmWN 2ف)2g9oLv"ﵔ-1ϓ >=B*a1DidZ`ׂŜzB!\>/nE.#)m7D] eCjc=*< ~lb?f_uqv-W돎nF_]y~MݮeS{M09w/۞ons :a{0'ܝMv-Y=jmc`_ߛj5z7G2vD4.I$<_$Mqj=4񻛏Rtu9jK˞ eh{m~GY:T/NX{P`gH$5 k/z/d+7X.BZ.u]RYWԕyNB`%]5\tERRvW̺T-#]!FW+%]2(κ4<#]ltEMzRQf] RWFz8u#|+5hu"&f]DWV q]R\tE&+̺[܄VF`ROM( 8l>95"Ji4l4MM#hM]D\5m\wufu W#̻vpj1O[7uUfS_弭a:ۓ ĺ>rn*KoPI>{+S>c{[lNwp"@f\7WoJOs>I!,R ֺ֪Ш2ԺnNf`L^֨Sz7;onJ)z-Pt~J_٣Iә[R6I3{4bj 5}.s, 2y|a~zBtX[o^t%{uϖzO^ޗwk:UJxnI|O|_N6'ysUIv %:˛WӫaanQ#h>N(,x ZewtѲST6b?tSC=3a_^=_i3}8Û487=qR^{[]`W?V.|48[=}(MQy2 o_||gWw /c8/Pt5jDX%=@0xe er6S]YRU-EY v21 2EBbfLfh;3཯f,O7'̞qŶwhC?u{Qxm{/ U܍>+ѡO'@_>4Ի|F"b_O?\?)X܍W_1~ [J~xHsraͥߺ6;77Q +klƈ\5nݶhl'P,aC̞BLgqٵfUC{CYAPڱJQ0.Z1-Nϔm"ǙgL\3Cq ȪVeY Qq=э.Rm5MU kwMiƙaVPl`h 5*T`gPJe".Jx[#ES[5s7g^{92,\? :鹓;ջfOUON~#Tχwۨ&jOխ*=lfѨJjk}.K߉߫Jm_E5>X|/_)f_8 ,Um-FNnf |Y7LմNW+ -«ٻ6dWm$ca{qv`gq/n0_2cr('J!EI|nJ= QfͰ*jQHrNT]=1'N+@bB?12DJD *U2tTsR^;O/++^gI8$`IbFqtr|.}.RIBDmA>rGahΤ3%A"Hl"YA-&bd􅤇? 
x8Aw ?>|vAΚ}S rn2S陱J giLeUIT&mST 'Q1ޖ}!4NV-98r9WXOg{oKm g~UgwQ !oNTpZ#F{c'o 0067?y5mW+Ne6w0KW}5nj/>t䗮xꫫcm&~v[SJlM;w-{no~O9WHP,4%)U>ŕw_fiJkUůJ{P"rᳳtJ "tW!P m]e|ds6XpˑJ#6:~<ƽd)a$2*E%$ RL-( -#V9Q&3,I_ H4#r28 8 5f7'aksW߉\$@VmJn}k)G(ퟗ<%Qy{o@"ACDICAx$:j:J\9rU)]4T!DH ֱh) 1(8J#c1q#c9R iƾXh cXx<35;>//j?ol@A5b&8A/K:Cf88(62"1'VDֶ6^ÏGJ{%A3tBߎQX,+Mَ֥n4 K.Ѡv)maԶjw v6a;.%*eVxkR $xM8RH$3 R^: EfBo IpD P~! b1%)|"vF! DEDw&}( jQOeLVk(!J ڤ4A 2θQy4ޣ'̈́栩ifԥac_C\牖,%"-7ic,8Z ]!JɝɒT!34Q$BIQǂŴc_W"$d52h)P  d%.;d2] Ag]Ys&B-Tpӌ4fhӜbػrķ僋BVé%kF_{y荙tf(7P_^+C}Cz UfEA.oj5MZ#ٛ 2ЋW}"]'|$we(~?^u:푅7Iz~'vQ=xǽ^}Ϫ&Z&$?zW/^<{3ZtF a׺QmA(7AxSJln@* 0c2jLn"LjJFr7d?rzT1֫ *dlPJn󢰻 Yi+ـA7hV @V`^DωDz@&Yt#ųyh] ߑY{˃)_Ag9Yz!&8L/Wǽa&0=x7.K}t9RVw0 8ɪ_|Lov4y"^qxȧ5l+H{!g*矟 _ױן6t4q4K7nWj}˙0>]G직]GrHak2IWRIGB)Zҙ'+J#1M~T,7^55ٝM ~h |7p9|G/~kK\io[;KlV>tb༞6?b><9 U%"IVi IF\d4`qԏ/>cϟ֏`k=vGɇ0,%^λkZ]}xZ?^^ߢ"zӋ?68U!e^䬿WSd77/e>x6d%Bw8./|W b%YoyěΟ4}sdp9i3od+$ <9K3AxD46 sg pb\R4 e 鴪||,%C/+^=Vs5G)׃O?S(?,jV`r@5kݯ`DBȋH^ iPe82 {w2 1jOY(tjM(gcZz4CJs)uF e ^+ew <#eoX/\˨Bdj\Y%tboyG3?[ gI|=:uwk#64·STZ_HxnhJe "6"Qd֑HxFMHg V(qRƂI&.`1^z\2|U}1HBqdFP$S1%+i@yH.93XCѾL#AjLxbT@I 4mDl$>wqjlSK5gk;W弟 ]yJ=]+rJ)ȞaeDӧ19wghq3]`~sHCas0v\P8.1 ]e *'VQpju{8Ǝ',_E8*DsU0"M 9U>tgAs gَ -дsYښk2㬔mv[*y-)\ۨfΦ&2:A#lb%]7v˵U|cс#q<;6L  @$c[{EDȭzPs6B,7q%B`"uheP9!dp^wrHGxD4hc9'JdE+T !*7 ttZ F9@2z*R򜳠8L]ƨ{A&$rbeoC{?85M{$"Im'o1̃1c{W׍eJ_d^r` N2q 0ZՖJj-A9|hq=-'dׇqWK.wI?vy'G 'ݙK+I U)Ew2]߽vs>rpڡ IrH켵F)Ri'qJ`ogD Ϊ7$! ,%a]YM.YXIzӅNvez'2/WއѴN+GGar|JXDO_Dc۪t[!/&^s]؉3؊JdҜof5;uU۳ӽ//O91dZ'ofk^8 M:k|^.wL_ ܊ ]dץUNv:pV.YB99^79CJqI3W/rԓ [2`SVxⰎIsڡ,X=~tVG xLgGirvo߽7o|7$ݛ}oau8- awo~>k򧞡Wˡ;/k U~tinr2]^\tG\U4$t4+$ ׎'z Xł(7r/lpqr=pu\>7N8``&phN~|@?v~:ɵ2EP \.{ʺk V Ojٹ@>Tv~P.NzܤCs:SlU9JK֖䒤.x:lT`~.Hd?7$ DD>R0F@:,sTU"3@Bqɓ yXHR:$Cɢj*|_jAHs-XE#fHtU }Mmzt56gV-y,K3/-曙iJvtAA] ³f)[~'볊ࣟՃ!//qV|G|r4Rem? dYg.ă/=vg7sЅ~h#W+* zw;y,,%lFnX4Ykw/;wO~?{;.Xl{ܚ{OOfw^hVl0kɉwޯ/K?Gώϛ1ivsֳpfs샐"ڒ\<(_S? Y+,?j;vx\mj!l(bML]hn\Jp]Z*J"RTkL ,Kn{<&!S1[ ^*%uRK*[XJE%8Ueap!Tvh#1>11L͇ޣ0t }aWStoέB no(),w-5Y|hq:*S2VYkYCBS*<'OF78u . 
tVJ DX#f ^9>><<}gOV>Rqkz| b{HaZz@ō|@7 ǭИaq%N=V#hw}+wĔZ-%W_Ha,r>r ]R9hS`V =s`jUCnuϕ=LRk(cev:,ih 997EʬSӦɦԄBC m+[hkɲNꪸ)$Õl%es$%kJ26!Uk %xK FF*A ) +d>IEm]LjH-k/m}\G",tP,9TɆʵjS}2T\$C-QHOAgT%:uNV&oeldl n#Weàq#z< oXI:YL }ޖ'*R*9x3]ǭGm9If1Of@}.lͫ5 d~r7:vm]Hoޟ"Ed\S uZ/hds6ZeRQRvUYmΣ~t|,f^N^3ar*6FE {mlo1}7=|?)h5n< T/:\}s{7*U1paz uԃ Rv iY%i]n|{GaZƖ}9??!zL\*+i̘ [o-\sf_.kik]EABRk G:GI2!bj| 6yId_6 nSw.wQFF_<Ӄ-$`IPNki%5{O9&psNU`fLʙmP>4IIC(BrIB:|0rn%$lܤ{K7OX#x0n.N~w^GpgA߲]]-}W#g߉`MFt=7jFʞwȞuXiOH]52ꪑSQWQIbBu%Y[vPꪑKꩨFԛJmW; +,Y<uTUָMWW {uŭxB5 z:ꪑ+SQWZgWJޢQ][AOH]5]5r͓QWZk6]]5*ݪP]a$=FRW\dnVnjT:UW_K2F]zϢwߌOیnJ?@L-J_0Ew7r58wIL-Bh!rB=bd5ʛRD6TNɂpc#;%=`ꁝ}aDw*j[9#-C;DҕBZv1cmp,[ "lR9C5TeO$ !(hh5"aB%V`]YTeK6uP.xHvC&)x DcW}:34IJ-OVf  ` kւ:`e*>vF~d/ (%k<6䶔A`. K9![U(oR#AAAC\XZ4͊|5lY֌9&eU\ Cp1 ( ˳0pކs(Z&p@riomJ`KƬTj)߅RIl:AoV[o%TJp*14V(Xpe,f7 K7Q j%S"zU|JdxD ˕3f(d 7aA1!Jb!#fP2Av9b)fI14;x X2@Z ˤ|% LsVJ-.MrF-%Lļս~iQ]YۃKaW/u/(+!Rλ2.u$(E~7h yE€X_U@"j0\(%$6*ihA$>y4_V 5ȤuՅvTn^+X1dAX? "*Ίf`6Z.k`;XA?Ef{Ajz1΁65D-10 zvP P Q{GJQ %h 6)qNJa:B c blwWH[vTh zNq5l,4^  DhRIhݽa#[J3bO*Vh I z آsE5* r#"C[uws I*WRHw]Pz@ AjЁFQ-@OOdPL;PqGoܪGbb G+ EISL1. trSpr,Jj`RIP(E@cDMnBGtvwX:f@ 4GJFdUA?FтۦwZdnf7S'kԪV@E֞Kmfdd, D 1TK.йDuN^nY%sP׮BT43у7[@[qgV0Jq4C6-9EOzʤ蹐X>?n D;5>T2Nj1V=XoCn1jEr7)D$r(:flҪ\ >E!:.H,=vQ@Ւ- UX mh/g358D'#@_/_Z<_nmיJr0Ȣ܇CS,uZqz[߷ tb5 f0,@1j kiWGnO[5b@t LM 8Tf:N k&-E;e=_⟖@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N7'v2N k#(JgH'<b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v\'P)92wBg!\&BZQe:h; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@/ '1z:N [b2Nh{BҰ:/b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v'GK냐K3ZjJw\^߶ׯVue~uvx ~B%tK\Ekݾ(- ~ƥ^yMo4 u?܄Gb=zȂ1{|V5YV*:L{l.zup1\}LA?u! 
on6t}_]N7*`j[J:1@Vu  ]]W:KoPgeqtZy.>fb|Y`6XzJpyq~ҟr_F/IH]K*`1@ o8 nLܐDtr\vV&3?55[[㞘\u?\] y0)?A*cӆSyȞ^nw?*x6d~~ؔNcl>S6fh%Ф-"Pqʾ'˟vUM',9WuޚmOdErMwZ9e0V!{AO(rR!Lhj!LNfjz*S  >@Q:S /pj!J㜘\!`od~*phW\{v~-$Y W >s火E~~\ivD?!BVTtZF^ \)ZN.T ڸpE^ \ieMp+<%vE=hmw(WF k9pd(ZsW=d"pei8!"몚T;T{We W/J ldu+6}+2K+|ZuO.;H|c#ym>(wZpz7%m₝ L#8R#rߜ _ dr*phpW^ W/`>zP6uv2p(ֱ WS+sϮ71J=?zXuϼa龶c4,hJ^87_J_r/ >\m֟\nZ]1Rhڪ1B Cj%B;eK#C~7K> >^^c)m>?4 w;Z?e[q蟇ԫVuy~utI ^r|:mupak oL>s{}r9_Ҟ! :.Ȝٷ띅nv)6piLt!լnvs&STz0I!wvEd_Wҿ\夝bxEouͿ%tb+3\0ꮏ,/͛_c:Yj;[oҸRݴ~jarg5W| m7VFoT.fHoK.IczlQUU:E*؅ZC-(}}y>s8/*iMa2~>P[X kp;zdmz_]~ _\b7O;o|X`d_n%-~;_a~^ F,W~!IDk-Z#^ReBTu'})>=\%4< O8:9y`A\˞ƈ>"ۻ9[6\dxMM>z:Zdug|Ш5x*YFec4"C-O3%]ȩЁ~'Girb:o{y>v7h޻iC|̺}i.oǢz>͈ԜCx]+yk46n+$ Ithl3f0t9W!ƘdjڤТ-)sBo T)[Ni -Ogoޛl~' ˯|FFZZ*ZO>^;9ջ^~衿bh;Mw Vϣ,F,X]o gOUBi E؍6݊s-e[o=*Ϩ䱔n?VǍs F%ZOI{-a,zU OKksЇTTpT)\.eSkR)Qׂm6eXb(@=%M9*x<"A+Z~u4.Z~{ Ï*[KGayT}dYlvГ.1z,=)c&!}1FGBOj>@ s9t))j_!t 2Z8Metp'>%K28 ej wzM6iV=ZYMyMסId=,9/.Eee8}r>t+녫wÖ]ig'm}'6f6?A;'> 4/w O=Ĝqk Q'&i:T-(). 'a0ɔ{{tR3X^tt1r a-hfJUm9&eA=fSxq4eeҝɶ[޻bByФdwVfT؊=k^\xJ5CW{ՃOJ1~|kㅡ\-h1e!]ZwM;^ʛ;0@>-!Ӗ2U{Cq5wі-g6nO{-!BY-R>MKMWgU&DEUpA#P 6y?}tDRP>FHiBɚBZhe-9a1Z.[JUNq1]ѥ`lSɷQRTmEދ֪:-"|Vlŕd6Ow)k=eڧI; 63 =izEȃ|5;)ՏWgf޴˃MZo HrɩLdXxLsV?j~g9ґuW[?{WFJ]m%@pK2;Ya Y,y,9/b$˶ZRlʖAMŇOU׻ljx_>qgFLrā|^lD('k9M&LqhٽFJV.Q"6OWw6m]2GGYhڞVz BQX?/ՙkd6J8k% DGAت69իcҗ8^ZqKHGryU_LR-V5J* F* NPۣW/^*OG_=Goz8pa8삹sܩ.cpEӳH$ o*_.>?]o4p 娪 22 `QNìQGa> ܈bpԷ2];zE1!nGv^]ڽ:Z**=POO$- Aj*Bc7`)C:IԤ $-qP0=A]W?)-3Jμ¢*Q 1PMq0!z2)3Ӌ}q)&Z ]vw '˗N<+hU0;\XvZ߬6ROS_B:cj!}4;Ȩ9탵)fuc{Yg9,@N YQ=\-pKH喗YGMJ5J.>j쏣[ DHFjd)_'ft#2(͸&lV{2lo01eizvǞ=;E܄gv\/ꌋ|MSdKQ3e1H\垻:>=Dz\ JJI1lD)K ' 88"( 'eCWd cdqcx[9 KP Fcr,NOP=t-Tw6Sl<}="o6tyRp%RO,\'dBm m<84ܱ)3SB>ErE (SIHeS[(\HBysqCg$Q!'Bޱ$D|bɹdY$PvﳗW1o([b>Q5rBP8Tէqgk*]6}?:Usoj[_P~9=^y[yՖu eSo\ Z6Wdi.rsR BgZ㏣z8FlF‚VJP?>Y "&Z{C-4g%xV1)H߾0R/̭FlG^'KycxiaK%x J.qz;4gKs`l}o<4t80h2ΝuyuuZcq}I*7AZMS 49@Lz* 7RrCT(>zMaJsJᛕM|HH\cBl|Y ĉęk{*"Βԅug$^yv2ٱ)s/ڰg8Qms.&z~[3ש^Yڐ[-@1(~x;RFz!RAP`eHGHEBKo$W`7 ʼc'pQ -ɳ8! 
8,[qND4#,Jrk }27%"Oפ{O*=¶>dr|r_&fö- SzpR~FdaHA7(_kǁ4Q\׌daglD=?+ʕ{t(WF)J83Z&p{:}J"yBLt KIn]9ȂM$B#Sk yJJ0ug Hl][dRg5V2lkbźe7GO1brOyz1ۇ?$ KMF uZ[jTV$$D$bdH`KP@]J&f2{{K%`u<u;˝^iTa];[zuW.&W[W^?%o©yBSBwa4Q}y_h(sBD@3>J)97u19?Mġ=is*_8;>w}<*ij(5M+d/~ީߣo;d~<ٰQu)b؄25 CL(s|Xq(_UCK5<=Ϗ1Zkgu16a`'VEy/Nl8R6}'(`И."?=0drv*/_+N}ؔ!-UKţHhPT&ٗ m3xY9O(s.>&_&x_#.zi.u֝\۵gB>sYZ_uA OHY׊k,ݼ.]W9ltQfCb_?MXƒLZvިgGyzy87~򝷑yķ"u\d膊ucɧo{Ko&zB4?/mY2ioatZbZP m}mBd56J H} %ۻp|؃&qceTKHx©s>1ȭ55:u 76u@>OUS<x bP A 6Jíݘ#ʝqIUN* X1lY[wD[S3 e24ѱ@C'f\rcU>ĄҙqJugDMBi [f\f%% ٰ (pAT|X`@"vI% u"-t [\57TkHZ]TTr9\EFetHޓ|JJ2L0`=3%tEwٹ%X;[$(Uaa1ɸ-X{,)^͙^dfq껽zefYV~[vp0~m$ =AG瑪41Y $H8qVD:>ZoYsa\頄W4vĄ*dYmJ;>f,{ڥ㶨m Qg$A \JjE)d VkR9AsHA$N&o$!#3QuFA1 /8IB>fv0fg; Mwz2Z]uעwiWwd(&reU JKEm efEib һ`>?>Bl:/QhG>!hGZk+@LXxH!xQkdԣ, AD G BP$HJL6hژ}&敧I`O\/o^#jg2OqL>lq~|,Kl&ݣ|X{c scr[b>8ƨ Gs6¸RDk%!=z鄞y5a 4IIV&QdI DP)Xr>F@0ɍ5yyntR\r4')IqII4^s `_ n!H҈IO^zք U]y=[̝<[u%?8KcSe:D(|r<. kwiFmgv>mZj䛋BٿǪ{֔Vr5鰣(ޙz&}ä}h]j}>FR{R?3d,^39o'm>[ͳkK ɳy(T" j^zU/MmD=Ti븚\6M6 Q%b,ȦiCPpLW[iY+vk]Mi) dȌјz]wYHi^ޘET OG)񎊳nTX;ϳo+Ή7f W"Iݺ}C֝`8Wlϯ WZ7q]PK߯laB=Z+*"4տJ:1Jn(--XQLaHY8vɂQy:e~Ύw]ѝi]Wzq|޳)s]_-vN=lU's!X͝I[(Aׇ'>t<)ʢ٧Oo_#F "gܺh6Q a4 ܠ7|󷯿q o߿X.OO]7Z~¸ #fu-Z? SBo+/;X->kȅLAnsa\ W_RE⫛5<r*Y/]v_>Y:Q? ٻ` P[{jY j!7]|ma|>?d-3l$T⽃;ۗK7`=p7UTm].qkV1oUc=2U$Sg.LPWqӬfqhSz6U#W lPp A6t @< xb\iw)XNʔyI̓r?KJ$ ߃|PDWUe |6%,{xƌ ~ V;'C vRe;^ 5vT9`}pD@(K!vKϊ]x>[= $+xv7 &.+˖x? 
φymf*_^!꘾Y:Q1w7 \ȺzY GMGq)|;@vmVMGEl`y ^I ՘]y\hQV+ͧ^`Prw[Jj`o@XsT +[3@ݘqBr :xRhTVnlN!J*+POtɏfFD~Ah USyn9T%7:E]Ǧ&ֱZʄ2&¼&BP5;Ljn`gDMS6wI W/9&Dur1Bt[4DC6-_ W(u B(pӂ5ļ=V)V\"Y,p7~haN;gI]tno[qƙf %|2AzsS KAzJF 0Hs!͉DOF\%rN\\}WJL[q+D D0OF\%r{]%*e{GW(;ՂxE-{#\BO;т[e-'!ᴃh_)OyMC 1L2΀P:I(a lכ1DЬIWt=3gflbiYD601_A?ltdP׏r[Ƣc\\n1cp۪_\>zSۯ^wμVPqڨ L^ҮegpU۵HcNFOo_XVv:7-dly )f1gpb#VfYbrAdt=8o5Jvw;KlfLrJ͙"<hr B`Er-$}2D|27Zݵԭ'Gt-/+쿸JSdU" D-.WZwQ +-$+b d ~_sxue$G%G%=oզS%њ>!q%]<^wqeDc܏} i+| NӴw2 h\$?K^3} ~|x/?_؉``9Ӛ* @c @,i0k1eߥ O޽ϝ:]S̈́ߞ<[?ܱ޼2o$GXk,X$pBydBH k.R4Y )@*]y+}tE^ m06N& E!_kj(])K6oX,ِը MlM@zMPP#;%qRﰐILH4bœqRPFތF:Rb&y Qc(,<a=5э&BLi(Qu9l.0e u,+77 MWU H8m1wT-1p+*P"fXFZ- FC7 *Ɛ:>zT!$UHQ+NzhA# Rp"Zf'f1,2Aɨ5V2 *#8<$j=3 K,R0T̲ f$ hkwdMUE>1 "PYI0b΄E+Xrk,)NЪ 2F+<\fCSj`d_U9\ Р*7>pk&oRZ-(g {l!at2Wʈ;9[SU:+'Y~/߼)^NΪGub4xN̅^[c u:):lp{!%*W-]jnb9F?(d`Ŵz6mxejUF ŬqBi59,Ft +f|p6(aܯɱTiyalEħn| >;>y}?~|s &藣7/a$,m% jk~ߊ\kGh9UztaeuLJח};.ONLn ן.f|b倈Fsnr.eFD6jY"D"^1nLAN";pN[9VZJhh?%쑑%|L)=P^w8%/}r:}\GXaNi0E,ok1K`P\ Ox{ȹ8L} ilM[hu7=>xNA<7Um&9W׵9?by4<0smC:ؑ* "vכb$9kV)Sb P= MRa X:R[ )JL`dE׈ZT kDIIrX~(qV`!'%[Vˌ.^U!OTBl+5X6M,ycI\yEm CNv^;?BAR.GR(k,9i!LrR7YRRDDNa4XJ;k`,$Ce00( GsA_JT9AIzhʃR2QʁIeꈘkG*!9gP(& :GbPňR;( :n^om=w 3UR]DJW3{&RVj􉔕DJģ0S>?v_ST-Nktl20OCJܯTl =dtvex{iqZFz_K.%U[zQ]tv\i _ז)my>\8G }m ksv8hWj0 ceXi0=#eDKxe 'mB2R:,Fe)"x"Ibơ px㨯CXv#_\K݇2=]B%˒Di u̗YeBbn9eқd|.&Ru˂IϔHF.5oFn!~B]e*U:^`2znX#f8~0씭UoҴ~{HsfAnlخuGKݏ2B7՚c)]hL!cc>}td,.KSlC+*dԇ>}Smn6LRB.d D5dRf.[iD!1Km?rlO7I'O)=\o ^V_Jhb^%8w&'i2{׿Y:]?m`CflӾSC. =! 
‹r@s *ʵC5'N et^y!b)Z2CP(D@I ]fv FZWږᳮ`He{ߺkd#11Y^vɕosI=NLm"Vuzet<Дy,y /&\μs /*̶T4b2eXa(RFF[9 oREQ>mX@$VfIZ$CHT& Dp`EDN؜_ƙcR\/_˧_ϫ/.o{i3`=5T}U[@հgԺ1^=^*ݾw1_~*[-z: -`k1sRgR 1*zs>O#LE>Ytt*$_s9_r˛fyoKWW[\M-"|b@Ɓ%CʤөDʒT"ꠣ>9Y%!96>)qG̮ s&JcFZg1L8}[I]QUޚBP`sT`ؼ9/Iz:ެ72ruwⳡ+6Sʚxl8JHu&Hj~ KyJv".:AwtN mԯ@){cƪHrhҋSE }AH$Bٰ]"ef_ֽI;ImX,`~Rυߝ9S5HTds9 CRce_'"md>]E&E@v&»@oo${q7}aC[KNהVsVP슥͙WȞ #jWrFZ.}o^Gq)0l[bt 5h\x ]@KMCc%D@[͟l{,ux,g/VE<5홪8},n˾I]9W3__q-ad1C?svO+-ug=6| lfݣ 7$'/#]\ۛȭjI^?{پuc`q`XW_Ww۷9ivx۟f!xexy=v~r;?L췼|yVV}Vxk|t/z˸'\rce78[]sJa2Ŷ{疿 Q#{ 'bdFFHԥ$Sg|.PVX+*a2~SUW/hzC9a\ta@KYG$#$8 yv}ʾ(_Rm;&(eL. )ftk%hY%JEj*c (-AVRyC!H"PcAmFnAkB7eXN +C~SߌOVTt!;i֚WFjF|I :DZL-L'Q%`,ŀIvcJflFn4Ӆ8P]B?£•h㧊7C/逞}xt0Nߛ;(EN>IP;2T Zy^ĘaH,%@fr5 )6 :DEMt ZwoEnļ݊;ڱ֎֢.:&$m-3R&KP4C:0B$ݻ㢴ə ujES}Xca`a 4lEfӱ:9U uMu|P>lFn} /E#65m5bA#qC6`E"MfƐdJcY @V- IRF֨ pްm2NިR &`cؒVkPۺ|+r(sltIf\r^q<*]*Z5d5Z0-Q龎<%E x x)wUc}ӇGPa ӻ" % ُg~T6V+vh}GN1vF! vPjp|2$=U&ع_賻pz?UNgp}EL%le8@M 'ˈ3 IVƈ^JA삥|G1׌qn{;F<^N@ٸx_Z=ݝЏ$+`!#@uBggDPD 2j4Q4B0()PhVR6Rm[Yvjxь6IO='M2I:smY4>KvIJ刬L`X\D;L69aʨ#qW ,d|K08?:>_Z#|%K3D5rN(t#Թ."cL`xC<"tԳz:1cй&"WϕV&0NP "kf9.F`kl}ON`RJz_e\|7'ȓm-<1` ޱ6Ó~,N_$Iϧ Y hw"cIBmD<%_TGkjg_>O:O Po-yw?tyP]w<}iԃNyBc2j ^$̅Քh 3<dm0m.[Z\'nFYomAXk6ZmۼϾpe h3}ZNПsT{j3<9q%aT",;?w=h~QFjdfQӟ>~#MolV$8ϚzCc/6^?zqbZ*~;},9g RH34ϴu"(zd4`ԏWOWx]MχE> hr~? 
KQU{spU嫷jE揇2_xzsz=MazzEr<@!^F]WK5,/gX>²ţ_s*mUsce-0@)ؗL77Ztx9CYhy7œ cɧKOݷֹzw:nN\qt4yL- ~3E*N_: h B|a 6^<0R~+ۯu\Ӆ'ws6}!϶.]i[]ps^ OKĹGQ \-vXllS3 mXJ:K]-A+*<|h)iTA Y[X$^jU_Ϥω"2KD΢|,7Ȼƭi䖶ϴig5W,,<(#(J&idxyfzBQ+kE[JNn:]!J:CRFrZDWX•-thh:]!J;CjK]_[r$׍$sQQm[.|"0ׄȞ=“VGV5aWДȄО= ktn5޾{6Ri"ZBBӈ6~V(9hzi`h"-\5HWrLZAZDW%l1 Z-DY[ttvRzu+wNW6+v P6mv6+]ZCm]`tk ~{f(jbi]`KDk RBt(E7GI[DWتĴ-3M+DY[`wt?t%ֶMthb*BF7e}ЕS"g1pl ]!ڇvj?]JYvt?t)MvM9Tk JBZ5{IWM2,=AzR )TߠBڬr6ʊRܓ C6z U3l@@t_h a0kA Qϋ ."As-FfB^H^b·bѿM XS)PB/d<7דfd@g9OQ`5)i&x ,th]aSZ^aI>>9AK,Yd$VKzc2Wea 3|{(s΁t3}t6Ҷ.o&)^44ӄf&97PuV(Ce(F*Txb (U x^(gy9I9Y*Uu!O1˃gB軏-JBUL X3#ZxK(u=4-XihT-t(6n>to0t39]mpc qԸК emЕ]O1M BV(?ΛNWv+ƤEt i ]!U4%]!]q7fܴm+DkU Pr*:C\*i[DWXvp%m ]!ZxB}+ɍdm]`EcB5+D+ojG}+%(m]`hk j#)?]!J #]iN'M=ESbjzR4rK ktmPo!`-ilmkh2ښ-DmiQnGtieL [Bb@V4Фi]}R孡+kHk-m<]!J;Еeכ#Gwf6Ɏ7C:7Cf3Е]Od̴j]!\BWֈ=+&0m+,w~{3F"+DIuGW{HWz֬5tp-o ]Z 䢣=+8m]` Z6H[ jtBFutt%qhGWشhi 2tBwtt2#`µ-th%i<]!ځ4@Tg xM)nx:o -D۴1WAlVcZsŌeyJw}-&r*%d0rF:k1:k^n%Wu2MfefeeR]./GE =,3:7jω4LF31Gύ6<5nҚj4 (ygGZ6m7!`ٞ&kZ;hͮoRwMHWfi sٞ&t[ ZtBȎ6teoXw7fpwd3fG㛡 =Е]O-L0y$պt(9!]%]Aϯt0:ngz_\Đݤ@<^scxx<x2z7LWu9_%wzq~:T𜥙˲TrXh~xl2 #JV6u~@vΏ}k *j蟎FOޜs^n]\0҇m ԫJ!屛:Hqy=1rfs@ Rd*7R#%KM4ðOW#ص&I 28ARaaj+uEqf hC@~72 eʋQ(NJףRė_Tu9!ưmDG,zTgn2juU9LXW?駣;5'(-= ȱ*ͅ܆Fh8e;TXٿYB_LG]8}'H JpLQ05u1RX 'xf3d˨BdjTy$`j1 엪}>T5t5ٸΦ' MA 揊K`&[z=ofl8Y_QY `vo Ӳ3[D) .S嵖X$?F@5cҲ7¿ Dͨ<ͣ̔HEATxPiHxlN 23z3QUƽ#ƐmTfrJs. c:dQ[VTAN+S&)9i`FkyP*湌4f,hPEMBs)C($ѻ8;VJSM.F2*s#"\z_xg^vf8 2 tT@)@'a 3JT6zũxkV4%x6SAK4t'[vߣXϭo_dں2gAl#l3Wj*ueMzǏ, 4t>3t?t:M&1ƻ͠b4z;Z8f~izuh`)"z i(b: P#h Dh6=^\~ഗ0)5FeMӤ4n3q9OuTc0,a sncqr|-r悄 ]j ʥ0=1,hݙac9;a s#Ip0Rw>'{Y`,0Q-1Hp!ErivWTulB0͘RzJP)^Q&KwY(;¨h2ʶd5$IPK(`;%$Zy[&xCGTtº#+$S6УLh'tXub"2m%D[Lc'a}j#L 1NJf5~(0L>⿿*iojX>m.u #ρDd]o^.i<ӦQ>,θ%eeHLمRrtWal:M Z<[k5Cr\Ȍrc/ʘ/\ټ~Z|´(S7FX%Z?/[WD^Ϯ?"pDU|6ŽZѐ7MZgo ~Mg%!8y恃]omNֽyb^_k3O'?^_. 
7\a/.k+2 .ͥw?+ƚFq$GjF4,oIGE+XVr7cmӼqTdӨ suL/S\H}2c^3+ܴN7R7t'1??O?|ןxﹰoxMΟhf`8k$>E{0wjjZՆ_U)~zO;6VNZ S/-B]?E ZZHv ]b{+LSn/w]V^ǍWB,a&^]\꫼wUGZG_'ISߺd$gR<` מ ~LҤl7PvKUKdtSpf)GH g0f]"m].fXih].[DɀWJX++Y(*D2 :\̱ 96$2GD,rc zoctkCI 3x2c,K@]H$vIQ:ePbYeQD$1 F&hFnHtw:%HduՔ8s<A ڂޗGX-GO6y"c%Ӷus%ռZ r߇}n 팲b~ ܪ*k,CcSdk:`W@c}xfJ1I&y ǘXAz&pmpŴYD ,$*BJVF%0KnCBe2N*3'ZWmbmY3s-#i> >VS63:Ǔ]OvQji۴Bf*=Yydžzb>\Q,8˃Q)E[F1Jm $ H9Q*7B6xqܣTޢw9 Fa3zN'ʑ%5rvKFx2rStٳÇpҠM6RE4',.E`"Ba"B=J]$M^$ VM"ZvU\MP"(A`|rGpxg&:ŕ;R]7vD'BºbYmZz=^ca84l(Qh-nqWotրeЃeh7&>L t+mGv-a!'BZRrp# (+\2X>O!̫`1VHQ9fڙdf:si;sA+v 7hi?!xyC &q8\HOI5+^b2Mt,q. }o_!,+`4(a;%<9^3&eR.vjm~!%ċumKiURmkt5Hl=GG1b,kŒnEC p3+l!@[m"lYvfǽ<ҢG |yp@(,x g.Og=XNQ3[,,.xiE29j&&r[eLL|RiMFLTnd);D.V:t+sJFަ6xN9>!nG̮o D*-O6sJ+f ;Kx ʢR,U΀:4-sPBn mN.><y -E> 0#y`[UbˑW[ 1%YYvJ14"j `i )W \=)F[Ƈo6J:2FfkId.3>!f- (r!hYw$nr#nf-oW~~h]u z2I ϣqbM|:'8pr8$g|z=i!+^)e UzO^rT!^M磨<bB`&XERe N ̥}6e~!;Js Nu7aDKkH7֐d$&Q`9SN>#dC S4r׳ags 't0H5?ƷXkf/ᒆa"6/umr4] =:\_3**Y!Hxe-o\ųJtD[ߤQ}Qkw]z`I܆}s*j]Qzu5o{ZtG6UA?l_ͺCjYQjwmy|xw =j -n/{q3}x6~_"p>nz}4+\Nt{o,]X9jmإĶ {ǖ߶t')5?ͧ9GZWm?{WƑ@_6ECob{c۰EЧĘ"ʉbo$u9H{$gzzjzz!u=&(HX[݆|8N.)n< ҆*̓i"s6H#, Bj똅c 6l뀻n>S- U[]p"x`:pnAj2Qx- yJy] 50$5AZX,`Do/ qfaHB9U  NH\ 1q/p;#PKi973䮞O=ZSU+r>,zs>۰9;V AT^c&ӑ4!6)ɤCj؏%UXJؚ 0Rhhh-JT X12#XPLjg9264͌uP5{+_sI3`X\_n~٠czn>[FQ;MbPJIǵXP%(aلH2ZʈeT^gpq[ٱ.񐬇a wg*qQ\HяJ7x5cg__&*K/ wh$k0g15۠o4Xl]90ɪ;5yPpQQx⨣Fn+( s  y[;y |u?}X*d~x84WaWL󯚶s%@ yu̅hS\Xa؉ivS|; <s[jVgٍ{s™eH \2dȯPM3̈́I/{cX^8{(dkg/_\ҙTc!?s:df} 'β￟'3VzhYã /Y8g n<]V<}/_>ųㆩ\$jak>oΫi3"FS0BxTaI 6%xn Q{ՖGh P3g^oU&d-NY].`Ͳ mMxZU]$a(^E0 N%S%9c`)ZX"\bYX:5ײuq0qmRQ4ْp^'URQ;&9-ȴ .u],-" SQ1-z8u)h2yQ9gТ3>8Q{\q֭ kg?;9PVnKBϕ CحJh^!뎋H|<[JuX.Uhʦ-)uѪҪٜHPIP +`q@iiƊbz C(*Y#'3F%1{W켈,*–~\zX v󂩽gSǺۃ?~ vzتO X58Q▯?N.qUg٤?bhЎaRtXc]h6Q ^f4ܠ7|ӳ,gѹ~i~,RKڰ F}?qmWOirJR7{?{=G3cm=q=?/_XT^!&_c³ww~x_7y·4KO]2DY^B~teC`/L> cA6{!, ȅq-\qӪVPXzc/H|L߹?΂A%;g*3q6 7 ''p\-m=T)Msh+PM:6~[&U!de.! 
#`fqzh$egik?L$2@ )DO^RV +wX L'i)vNR0M4B+Xr3prW*I)-\}-pjG֮&z dQJ]ppUmjW FZ\X$#便oWIJF[*-?Sk'Eq (/6{)Dx4p^UʰdC85so%]yXTh'O4LTȾ{OeSSlώẄ́m չ?x!763 e߽ A(Mj1;З=V0 `~օ{foA,o7Vw5!Ogtd~4ACyVYu7--h8oA}z?{KוlG(d?4h$q3'I+Ѷ9IJ[3[4sѴ[o=<뿶d9ێBBuX0c)1+hd[-Wm5΍6JDɚ(^8∰ K(PByo\ۀ9f3-"!L%mJ)kRL4ȃSp2g!6>DÜ|?͗w1B S!J@y:! Qik[4x,ïaW=MQwOl#ܔ[2v~.Nywo3!M7=8}X9=,c H uּ(򝀥 Z0D#.o] >|[b֔yNrdAArm8BF!<[Χbڨ`F0"aM#ѪQPAp0. UX1cԼSSHKDac4P4'( Z60G=XL DCz-{EeEASĨ۴heB#ߵݾ /?f. 'jIuk]t71/܉0zJYC0mN>qjLG/# ʹ?oPf jX ~65 ^T23>\ح⪷C}EsHܲ}̸^/K?AnRLbtk) !VWԑѵdI?HzERYNzr+bnB/gkjNޚr83jf̭b%h^c:\RנP ),)y{SZ?wHr&BI$f9cv*3b.e+>֝u>V+A1w4oTX:.=O7Qy0 W]^+y:_q]_џ.L릋;ZvRU 4~wMPw:$۬=M)={ڌ/O+62c8ֹwԶ\ bu"ouB:86fNy-Yw!m<F(2gc4 AJ Y8&`#ζ;s ]koF+Dl/wMvlI "%(5u=Ë$;#[n$L39Rr0I$roDeĹ~!U}ڝPqre73}rG}̤q:2 )H\aÜL:XRj sX)(Gk'TdЖx3Ffd j0DȘM1WɆFƮX2cpP0If<$ ,{x rsw~G*6&oM>Wu,qYs2Q5,gÚWyʷW\]uQ絻Ug{(iY80N%9 c0RREDXFdm3O¾9YȜo0ywE}:-&^^vR_b 6X5"EzDrCؘ&r2`'&3sDb(8B 1ym @)lt!PTi%4="ϗ#N^pVb\48LFFH %SM$e%՞*!nnD+}W[~'u//E9,nS}ҏ.'>iԕѸ#FG-ڬ9dqd<'l4!ͣm[ {m۴tx(յ -jYs/7n:Xt-M2TWeP}&CN M^E3}W')0gً{Oyw]ߟ$,yl+K393HU&PvWIQQh>@o)w.o'3-zЮw]^|TT3wMҽ7M.ZI~xͫϞ=y̩yW L݃xS)n!qXzlFiY#| 7EGX:a|K5g_,ԕ34C&:3?—s&vg>qtNT4ߨ⣩:Aڛ>3S]?МUOkOj:aKϺ+ʾZۯ[fɳ NW6кh:ҶOtKޅC)̀&+#ARuM(.zTYojkκ"lWuMn^o>zOߟJ뺭=R=iLZ')|rݛ^sY5Ye}??}zfK5Mmߧ]ʬ ɇĕ[\wՑ=ݪW>,^Uo^_Qœ?jӋb6^~ɛp>A}Os\^lޥa60?גݦ8x]$a20`RM&?Wh!'i4tzxҮUD!:":ՏVGk9 lY4*x7^q"rOeޫmŶs!=z\|-4@ |m1_X30C׸ͯlbݎֲ[ד΋hhu?!8C~za(i($gsJ)񺴄{p$؎'RZ #T]2?a}kp wgaV։nحX&I5t3L!I09'4d#Z'*Dm96 18P3o1=Gf0>3WvR0E˪~|i/z]úg!j~v OkkuLGoy"[ Z?,}r4ݖ~4m,[Ub(@F)ї3`-dd1Eh-aeה:s5=i7Ɠo-[mmY6uAUEdo7y3jI[(Ybb=)x/م9 GC =ټ,]6*9Ĩ'Q)"c`Rz]Diq Dc8h޵:,/]Ӓ{&/ _v A8Tˉ ls3:8j52)%3ltT0%sPq߫FUGlNsEQJ~>rsG}|ݧ_KɄ,ےŃW&>=P;#of4<r&ÖMz+p>\D`ڻG0 ?Lo"gp3q}HW.ͤ$t*L$@$KW=R`.yV/ٗę'>:Ttyk2"ox+7z*W˜_vrԓ_#o>](HAlm˝\Myvs@ۍnx̕s1VR& 1h#kGM,X5u[9j$Gu-t.U;Ra4ɽ{(oqCYfJ1d<%2@JF/tJÃ&bBl|1mR^ S-wq#rlJa^iSNY jRz\TknSu^XCW\"5Ǯ>7gu@bv:\xڥ0%XzTNqj\gpq+ɾ*,8[L}%hZ2$7j yZ^bR##J9xojꆒ+˜R; 9eLR2xi23ŃM͌\Fo|g_+@"wq,P g"TcBCDxPv7ܱ8ۀ:@:7|wGakHOzjx6L-޶t?l\Z4v ixdGwÎxOUKl0g(0EI%3ȦD>SFˎ7D(mJ$o! 
] n0woirA{$!(4>/%8M(rnY$-J))e(l\O_ [xmg rD{G,h"x>wD}(>E{D%i>h<[(DiaBfaH-T',Kl…݅#jE,4d\"ۉiuMA Aԙ%"c.$Yr2;{98nM;gi۴X;ktZmA'֤Bۤ( $Y"jBW>jC2G[1 >F5~k۞T *Ud;6\9ٔhI"#AHJ`-I> 8F.Iq`ol,ֵBh[bT.fm2$BҤvc2烊(u*BR`RL;ﶁٓ9]}0A[#3qX!S BAVyQ`Dfi*g$&tZI¹19!xT!GfO#3DPar9oyOŻ~ }Pw@u^@_b]J[ANUoNI9]&$ ]X>TatBQF?oB +5IbMޏwȳ('лUE>#3P@& vp ҹ=]aapĘZG,OAAn<;?or VN'js:*i*#e`$2[_H^Azqh=Vnj=7Uov6_Sbǿ/}?c?wU zJ${+6C/Yqm=te¯WU|7ߟƧ/'7|e'l b/NBs3F0z DZHMń &bWOTqyU!b3C` r ӻaHd]'Ipk4Lj`ȴN!ƘM1;١HYKcD'ʥ[@x8DM{+-"͘0C\DGr<\sQhKhMc<ѶК oz2xn&]xU]Xsꆟ՜E/%/E RZ))CrMk<Ȕ" TR ^Yf| :͐5[I.7Y-t4_>]r_r`SIRȄA *K<&'_G(E3kd h 1J2șp d}$H+tdă&JeA$tyXgV|Xu/@tٔp}hpiMBUK.F{- Eelwvr"0U+o(E)T2_A.G1WA O4Q(|:/.EBPr.MSNkɢȂ\0iVSE|J~,H^`(tic],i^Z昒1!=NόTȣA" 6EV:B[_v9Peru­Bgcvy PNx+P34^sI[wMNR#)! S bZIeT rNY S[)I*bXb0TH̡6>.x ϽẀ&[̧ 7:;-Oe@ SR\5 ˘\d TRrϸiZI+T*I{}Ph:Z3w2,3FtU}7qp!fV !|%Uu1W\ ۦB[`o~s;z&糫 ;o)#;auSA{ʟa^/f1tM sƖB ୍(b!-z1q_>\Bt?,vt{q=M% R|dϧiл 3)^ok-H߽Z68~%(/K軟x7w`~G)ź;0F9͆„?Canyu{sF[{71|&ǰ6'dY)M|2܀Ił0(_ir%#(ɤ*pgRlˣ]l] ([Š2'%8YVi%s(1Ca̻F"p!a],#zId *ˬr1&dVttc<|@ZmWBAWO*7ѶR.Al=4L&V|g:~yuM[ z)shQ4`bNҠ*$+40[y{Lg(`.c=̔sII֔d!:RI0E2BFtdqf PXkȱr @Vʮޓ UXA9[Af^-\o 8yl;|g=2fL`boL KI궇(|dfjL`oiNλՇ"4ajRnN=תyVv`l#&zۯ޻^?i~& x!$Gz-m=RD07jі[BamWӅJ۩W&?nQ/{o܈@4?}@jlZtLz iIJJduuyo0b:kϦ2-A|j=,Dmn5q'{gbijEẻfXc|3>]\Ы ^>퉿9QD/T[5ʝ~ol*RDX ` w,ZdUȻ]ʮʚ9%ۭ+.8[P B D+X@o! r"r6"NWy#͍ #%HG.L]ȝZbjeB%v wŶR#=RWDbroU!̾B0mWWJz1 \z8`H̳}%qZ.hX@][u; y3GoEQ_~l +@H6:kw V`Vh.1]}yl=OƵEwn|i^D0Ar-1˔"j+$'[ŝD'w.34:^I[Ax`oZBem<b _Vmnج96/F䘗 ^ Hoe&ҵwA>Xu\nh#?ˊ3Ǥ5[XåV.0R&?tve2LNT^pWS:MJcgJmcJ}([Ϭp#*GhR,sD:{3%I~B YGRj4hb:e!b$p hQK,ẎȰ2 xv\hIc< N&OP9^?Ngy|w?| ͂ o 9; XR.Eom}sh2`6'9K 섷Xo/d61r!2hzYb<%m<V _ɹ,`wT"=d%D8֤F:fG$$خ 8i[!-;h[q޵7m$bҼR9޻9VZS$ӥݯ)J"H,*1i ӭ)j>DsiAm9ᴳ`&O\O-O̚NCFcJqI3 ]BDPS V'S`CX?쌜S4tP. 
Z*c~-A@]$l'bR2/i__Ϯv_֜xE !?ȏ'gZI(-<'\triU^Uj T2^ġ=T78PW)%SWB%^P 8V}LJW,*?G mTfܸgE3-MٍQ SU\Ⱥudo^Cw/1g=f YJڮ7oU&ELd TDV7Mx1%wֵPA2A]bœ4Tf7kpm?YݞMmQgMdC,>R̦~z_Xl lnM7P6Z`puv.u5W0:l[B֎]6v&r#iU>7ԕfs h|[1[غ~ܺ=Nf|8pg[زlun7M~oo9pC+-7t<7ϼ # og[t&x* z!ȃFi)qэ)j+Q $r N5 -#V9PQ&SC>[COnyɎZɽI&)݆ݹ|^iD1fM  J -'Y\57T됸ATTr9\uBr*X e*qz9XΔU(T9ۑ;]3,3B1  o̮f`8lXOA &҈m$ScU5Ù&Fx 9Z')bs1{1a+pJl m;blbFajYxm:Fَn< +&vWܱ/jێQkOI)&p)4H/[c !qʠDbFH} IZȨRVh:f EfB1 D9D:j3ram/B30 "v{"$"b_="p<Ę pI"w&}( jQ R˘l>(-6RpjҔ-X8FDƣx4]΍]D|qZD.3.iǸH{\qq;Ƣ si`\W2H$LhiBHkg4 "dP) cagܱ/퇇aw?m?T >9AϦH& @;^|y_=/Zk+Ya[0F:Qɨ# ¼מ8 Htp{DxDȼ| P =0|9`ʑډPi{}"@mG&攣'T#\9nϻ_/jbjۓSsK4|x,uԅ\DO%NƨK(@ LHRa ]PDk%!=t<" ==ѐ1'tD$%ORZDQ  AȚ|D-''ό ey<GvA'i% ) ҜS&j-B㒒h6p"\"Z'Wx:-!~CQo)_dj㹹MaqǍӱ6-w" 1a Wa0Hs4G>̭ woBuîV)[{齞l,ZVhPPݣjM|[fRxD?)j,ͥz*4dk+o⒯kü)8t1"r''Ey>+ȒA/EiA%?؅ɶ'㢺}4?k~۷o~wߞvSf̤ED5m}푲ƒ:TIc+fDHqV_+nQ+j=;% YiWUNv+?5yR*7aw)zZh^u8F!%} CyYr,P^7m˲.1_9/y:Yn N~k`BKTQRˡ,1IJqЌlD0v"L*z `[ɇ >u]īꤍ!>4Ը%$9!\x|`M\"+Ix:I,5)%7}%_ʉxPDjp>t'73[.mF'̲wokSr_dCY? =i,o:p|m# ~j*]O8B\.,Eu>uPma/v~t9Ex;O)~Iv qE Ϳޅj4U_ ;؆Pxd3tR[w\呟t MG&9F82A;UxH3lvަǬTs6id3{;VYt`bg"r^}8:tNO4 ,%+Kx/YHU1Nw'rT/"4I418ի a7E!0vA,W<]@1 !xD[+$ <9K3AxD46 sg pb\R e鼬jr(Yl:^[SΨ)gJ9}Ri2nW2tѢmy.u)"_!/!y%vCLqa8؋GLq@O(=2t{ҞowF㩖{:XFb@0$mwǃJs)JF)V\Sa=C\4)c$wz|\F"V2)XN9;9^YL֭3%_3_LntIz*DEsCSr(shY>x "D0l tPo5B@!m,XnQʙDiL&%uy4{9I(k) 3IŔd cJРt>&E&R;.R K#,hfoaS=Aj Ĩ(@8I4\pQ+N-՜q}տNԿS ^ֻԓ(OY+Mjvlw1wfq3cisp8`2n8=kG HTJK/WPzbJa6Zcpq{U㨲At1g"B=m^EK*H']p4`M3L.;VXw%җ1ô[=0(\Ѥ"86}/H >.r-@G` "%9 $Em Wd-̘9+[*?Ӑ;B0] ab`W=qVxO>bea&˓{'WjKB w6/ȳR#qV?qUT(8Aa"v4!GZqN?nx}T/Ж͵ſF9+}\D46q=Tk$7\ k};f|Jrvގ!P@1O@ $,̞.~[=={rr<>P˴EEs#S+rG3ƚ71µ"9g΄>+n=SӺ75vS8\./^*gLb.4_ۛ]q @>:}T4)֌uHBm>lUfyh\cx~>z1͘l):*#G]l0_.`rOo|gJ8BC@/ɾ&5b, IzxHxjJ<3=35=]տ7|oo^7? :E30I6kIw&w"en=_+uGZV?,~U|7m<.1)xD9jmY"CΧ^bZB0io-f` Si|Np\Hq5RJ(tqsSzbA(ed25<:}ON{/gʅ}{, vPŸuV\*YUafq9M% /b;{NӠ7q S[y|5Mr+ڛʺO?6 ? 
֔(gCeB qK (tuNGUr?F;L/mNۖԃkRͶS H1t]ӳj!/gF 1ER1A!J7ަ(#E3Ѥ5abaw, cku-s!Kc8ћȅ\s@"Fk4W.EGE|stxo0(fǒqzi_3<3@r ֥Or!7RID,e> 1k(U))2*vy5r3u/3)s틵||mЉaG]5~.M0]:˴NzK> 0%%>-5=L})T \6o+(I\eR[kͷkc| KT:K|pM+6G+h~)1˴mTBx)ќ C%" K):2x=eIE/p>q`1j6yϪdL HF̰yJB8_O_|~1!cOr.lZc&ǥpI/W39drFd 0@ǼUwl>`۴kE4/Zl\b 3+mgy sNy{n>  z$irVr[XCb4!`OфI03chH#=sǔP:dilNf614Kf%ApUqc@W*qAH-:̑#)SDЕa5rvmld t> #ii&S,̥4Sb B "gÂeG*7Y|_;a[}^~lAO&nyxѧMXǭ9r aOk, ѓ EDΜ5K1$jKjlJ5YXme+ meYh:YxPY͙\d&pzrG8L':ᇳx%JM *" ࣵ& &'f04jt.D"f1-h%i!e`{A') Zy.L. rPVfk]b ^S1FjZJmWYjNjwvcJeE9Tp:c1lqC* l)ZUfLW!2䊴 Iu,FAHMd@3R!&XYV#g'8G.?{6"L~0Ifl^gF?mmlI'ɓ_5,"SeBRF/rü$-ûk?U .∵x}lG~OOH*3#mEh$5YL./X\20q݅ga]m%!{L}~'Պ-:ZaɼlpH0vj5F rH# be}0຦0+@.eLDe5q [xe(%φ8L7i]oї> }X}7ɛsXޗV},~?G?Rj%MRF;+X["IAԠ`)va+!jwS (iY80N%9 cR0RREDXFdclo`4r 7odN yw>I\C;lUCAx[rHvH^)B6&x"\*L 1X/9BiG SߩFi?!2%bQp*b@pQ'.Ilt!PTi%44kx+14 2"CrФ R2(((+T1v7&ZD /8s~ȎCv=};'/͕T7+cYWuUrufy;ͅO+MqbÎMMs]}oR67pyɊ מznzPJꚓU2Le40!äIuJ^NU󛷯GU/Jxݮylͧ%kg)ʸ&u0:)if4{4+f?d8˞='3zPw[^~M.i8{ n:_VO޾}㛗-SF$jarJq`|`FZps7$'*e;aG%FRM`<먽#d\WP0Y7 sv]# %Yd ]Y]eq}![2PiWn^"AOME)QT2U>(IH:"h+"+B]ĔK:5~?dkiSslA ˟bXU-*-8qvՇQ>]X6чT 4*QVr21Ύò,W*^q2 iHjt8sgO`~{*$Yٱs%=z=՞¹EZgt>}g֜+Cݖ] ne}TPYxx%+g]؇56tY&93#cMMΈ׹%3d3& , c*1?yQ IݠV G:LOŤ$QQnc8M(Fi0('wX$9*8!Dm965 bOzϜNB+(1['A]lun^W,<"JafWV]X >Ꮂ?~:U,}q8Nzco-`UP8 Y3RbE^j)كBA7C´Q1r sp8H!rEXc{,._j'/vVt E=p_r-i=F6Xn{W,R/2AkLXcbPǯ7y+&l2)Ih"grʹ %{r3mx9:/Φ^|%_\篴#s&%Q3O#qMc`R2b]Di "=HӊS}[!rrcKZE (ݦDqOxpF Q9gYP:0V#C^27(4]<'az1ZcoeaPJ\7IuQ7DuBكB?O5SLzkC"$gQOsA!X;ֆ-6gmP;ڠw'UFl@!05&Sk=> (T 0 /q4d^Y+|gJY8`1waʽ3ø)amYgPw)JnCjss݉1_ڦ)i\o]傾o`yzz]~6@Z Z^1^} * xWǩ3,VIO >=kz+^F OXܓp䵙^D>z·˻ ~?cJhv1wQm?yy7i)ڮ5ݤd SdQ,YX`.yVęo2{+ᓟ* *8V PPlVU溥Qv[~Ml2X PieCc,+/ޫJ>uK1E]R7UŸثUH 2e0,p-^g\'3նBKIMReL aek#bx sIA%f]@$\.%6!`Bi_ a*5*ء<: Go;Qki=̴#|9O^COc >M\7Jg JK8[;px T4+jYVjluĭNnNKcT` 9a3ࣗJªqLR;mT Қ8;Aݨiu7Yvg+cLED[5ǞLDEfnlc\&ҟ|ϥbh$avT$[`h1\KJ.t`9yu4\p-=dȆ662Tݏ}"xCnf(Wg:84R8̨aזq!3'LDL llRIٹ;qm$_R2n͉\QxnVI**k7Ӧt?ҭ;˵+D[ce`QE[ Yed2M5@/ɔmR ύu8| vBwDŽ H؁FǤ"9AfXT)O'L 3"APfd];!7'aγ|3,+:hQjd- 
!_6\cJA3"S.Ř vRYQe@E4κĜJZDz1q$LץmץzB`K=XDﳋ~0W{ly\yy[R>OݪAWK">)l>K7Ca1q gv8+Z2+gRVYէp\ڱRik/V*eM ./ģv8g9T=amة=غR[9*-JӦB09wwG}9Ϣ%ɽP̧a|)m0ebJB;:c9myxLG̔)EO;3E ޭ$>*[y)+4m0du].sw2gyϞ$_\DȔ0>KFa"~`K敷y"`ˍBۭV/B:C{giI<Nb(˟Γ+\i7.xϦ}Wnvg7=>?EU%q^%쀗Q` !ZeRuL2j]acQnn37l'pү'Jq.uQ,M)"+fwɅSZ}{Mt_4@r!/oL|󖾷G: Jsn*4]̮nfLooNgDs /H"Jx,2%FqĞ{jQ'p{հN570^5Sq׀*{3ap /!eTOZ԰FWN3B$A&BHʺȴjZ9;`B0uia/ X8ml]M[cg!8aʳy턑C2Rʨ_ZFSAwm.E&^ܯ0ʢ>Q.HSj0\L\S&0BQMyŠ8%Tzi?6v`WDMV֍K@iOf@j&9W,YF01 `8z͌#( Ç@2UQl"PF_oRVvy." Hs0GI=GP>S p=^quGgeKTj:3gǶ+j+ۿ_[pJTjp~-z^,٨Ay&qp$W',3npFa<,t,[?]e~z56AuЋ +3dߖppWל]͚r#oj_CD|X%C'冞ٹy-} -hc79~g\:?}-f#⿏sk}mjGYyRHW 3Ҡf";丧OW1_i!&siSnLAllly Nfgx fZ rԍwfh+EeiZ=OnUe Ҋ0RG鸟!0! 胐iAjnK" ﭖ&.E6@UNnVj@+&lxG* V8!HJn,!\ڲL0!%A x ߭S=UKJ'd  ݴt1o.mE9[`Җڛ'0#I~mCw%'%m)11Z")fk % SR>NN(8}!3-B1!_JcȐ <]/^{MVNMTԟ?E1L1f;o{!#N 0T"V߿.v4}>H1hkjK9F~I;64RWAR>_r=q~ Vc+:얶)wHcS9 |:XXM;^ÀtN&w\54-UyÃF)Rh~"th'mgJCۣ,EtjH߷x.ûŖM|YڰdmM;Uг?l됛a'- ?f ڧ;oS t\D&ԁ5 2x$>#x,;9'㝚EOQf;ZtW᱆$_ N~=#r-o@HEc5&+m+I .U} : dM2ODY$-i<=hQz,lH~UuuU=3klA*:&'P^ƛILevQ]F)#)KT|G"<~ .؁nH-)aMT50b9rCf1=sдb gBiE>0g [ {b1/kcd$+~_FVd! MPЬ,u Vg'W7#葨neHdYfD}uq I-oiAp=ӆ\ٿV+9~'KXT8 {! 
1"ZGE"|In?uk&8py% @+!:n n9(Vmvg!>_Vk_&d76~}yr2 K+^wFp #-Hjy1.Pĩ*}pҀ%5 ZK/˪Da 'g3((0I@-̒Si&(aƥpt8Kgŝa H$dbMDu5_lalX3g![]h<$,a#1vD{f(K\$gָ o·f kOii r{8= ~Y;VJ:''XJ4Cxefי!%]Pyf#ϬwBv \9(6t`:W˟p^k NQU$QZTRDt,=%ch=Zh$178 ğd2!čeiO)8ė\h:{HXMeqО5?_s˂FQJau#3iiZtAH⯶=t?ePW]7,g84 !GGyOy_)HW4H'q?쀍GËKLӄʑ, 4sDLg;he;k7 '* k55B*{<3R 5n#.>?g'|}oցqh].ϕEWxV8_j99}iÎo~ʐv#7_76CT7,;Fg7*y㑦| lEpEQrhZY& fmCKHoo_f@fˋiL T.k2YlZ;#L߅J9 +̹nd{@L䧗 Yprjox3*;J(äCP6C9GAx0cnPO62d +k9&Cp܆>Ukw5AxdHތe&W'՜9ĩ3.ׂ{z A{/^FgeROj=&a sg6.4]u T𛏋b,E._Vd4ɬ.p޽2;ۡ4 TD8Ӭ`kM+e֚<͘!r*,\h٠!I)bgHP*D$P|5Du@p-֪%Mfn+C *{n,%:r|KԋSTa:ӧwo׸$ZwN!y5HCZBUͥ#3PW]h-޽5~yS2E: gޯGс%Hp,5pon4G_uC`̧(tdG- W;y0JV'_+)jj Smi]Q4q&辊&oBrԸ :3/O(* n8qbӅ{ɉ7Aie~#6~*mI,bmKQ &|ˀX21:95hvXfu/t{[lR'G@$gL"&Ke&9Eqg<2»nv} gf+S|W݁OiZ) "P|`  t\ƥ5Kx׮h`Z_"nU_`?ER P7JxSh**MF/n׸j4^ݽ.\V؈Nnz)?d4_zL1:X3Rfhhe3k?2C[b;laUS4 ^m/o3ܮPF .JihceRQi,m'ݧAV@aSd\dLk^3[YiS6@eYkݔzzi0յ):_(:( JXz ŏ׊NqqO8 ~se{hZ \jF,J@[~v<6?]Rd{Ѹ;b)Y-, { {TYLc'8sֺʱwk?F F eFo2ϋZnUB PNq8 KK'OE34hSoϓ`͇8^as]OERQ.% gѵj 42VX.ţ=H^6<]UNeiCJhNj}v-8ʔI3}2$Ӻ Y(n0(A,2eٳ1tah4-,O{mzo շAMkkhqN[WA2@/Zbk<=[8+6H9qq;VD**'cVd]c sq 쀸k[1^W;/EVM~1}3yX:YMŭ&ڊ!մX<+:emLaUmwK?2!ML~#RHNH1Upv4:. rDRI4b_ݵ/e 2,H:mCeP_L):9Eb]&δ+2 -BiXoi3ݍ.55Jp+EۍvZ>cK Wk,jM1{`O;7=pqpIiS԰7z@HʱitGˋVsߢٻ.k4H;V݉B@Z7.)o%89sA##x/''d(AȜ0_9etb0 سGGj~p3Ul 1ْxƐb%z;6- ͭ&kK!\쩗0be/{]J Ja$*0hiL0MTRF@ht& BOW,*]¦hBlJF~9MnW;#rQ'zTX%dGh@7BCtŘH z ؜4t +|M#HzD༳ H'*aS"6L?P&<k6z]"n3(ԦWJ=S~Z,|+f7e2 ߔ,r'ɚRUxX=#\ k RfJTX]w6#DUREjCHSDud`.Clo鸭H.N"6Zh#oͽ]׍ >.?_8D|WO쿬Nud5zTQ驤 69yDk]p0.w}ONo7>~~˓ 9K)HM!ɝO UG'.% %mTQ˚OџtOۿ^,׸wOy޾_\?{Ѭ {R,:9U&+pSb92`;Y 0+ I|ynӽ+e{G'l۴/Et;wo7h^fop y욓;;zubp{ov{/~;\M{kxЫ|o[G?@z;7§VLCһo}l]]\˝k>GAm\7۬6^hǯm} ;ݯT@Ҙl7~ntfzOCb`x1ۦ1ǔ_\ltIL?ߒ? \t/Vyx#{R~宻W0í[Ǘjpnܜ\z{_Ρun|} 0 gҊtwsy˓_γ]|oF]kQo[/;czNr'Ir>y~x~__.Z"Z{;o=\^~ucrǗNރexi{8'nnōRv o$օ~}R=*50F?E+~yhW~utyFћQ8K_w}])m;os? oy_tF0u>:Hv| no[eֽ|pn`v(~9O}]kH X!Y;?N7݋pKCS夓W6dId:+& Oјꂡޯ w*>q7x#|o܀^  tˡ>Pb?Rݝ;`18gkW80:o7K fvs .8.-}9ַ. 
n?mVQƨHuEAo#$9kBF=T|Uvjftv3\6"F8Dj@wϰ?CE|ثlފ4QeAaMR`-gc31g#g~̕CD 0 ;ͱ z&Z1d$3t13WZ'\\e*3W x"mE (Z8SaoG ϔaFVПZi_W%e &4Au"GMlu;S;R5`bØ zԇ}\XҶ{ghHhi-O}uSNh'G;>}~ӳAK+Q;q=|o0\M1lS{`\-؀fks[z]F0`@-aPM8tnUG>~IBb`O kld kof#?`fB$B>(I%O KWKd[ .z(,fRoҪ i)PQ7@h=D/7'oat-RmrrÀuPD!9Lyڎd/AO~B0C?vD# gLXZxV!.B([&Eb.9djAN%2KV_j/SCפ e)?8"SZ2+\ռaF$6ŰL^Aya ƈTa+QD47ic)& sa078S$5 Pʷپ{9|e F$F C^*@8-=J y (hLRFQ,8ޢƟA-1yRC19=Nxg%J\]\j!Uv]lܜCM :MEi6z(n:=3a<NGBb)j<[&(#BԀd+LETrI\nJ҉.Kt$].IKͮ$8D =ܪ A&֥ Q`I4fEdcZ0VnдR!Z`YBv ǚnYjV[2BioLMjh,wxFvqRgˆzRjAXuH*dh陠7)-N$YJY'X- \6<2EEFDӈN#H&h~RLPf։ љ LPf21A͊:LdvX T8+ea*h1ʼn92}"k #P c^ Udz%e+8}$D0|rh #kW[z0%[ VH{c d-M|w7O53@J+F(bZ3rw]M\#44}04Jġ%RjhTR*XT[yq""XEmtղF|6%  *1A5*J!*zi)WS"%=)z|g9z G=f=0 P&VaR0!tE۲jb F1,y4axa%r#N,0vh$~@y3}$k,j&s}u1yDX#C!cVP[ (F /$+p%\<*n.~I5[|^-+B~y8q*IiżocMKq5^I!:N Ք*+wBYgL:Kέ"FGl"CG]ZdGN4|D#21C>bLްJUDɤj p,PB+$'pHJSDy<\kz[b #L3/Rl.8. V]{HjC d*)$G":y;>/Nv߯BBYsL\I%1R'Gj6JSp 'US 8X5a^_wNe7׀ds+A.-> +yZ=䜄Srk+|蜍oh5X)JI UdcFh8m+vi:# F$f)0{1d܄L3ڙDf&:l$fI!WEViWϟNO',H  < k(#ޘu`# ϧ%멢JJB ?>Vwq9r\U" =,l5V"K$;/I<Ӓ5.CI>E> E`zw(?n+:$P~_}Jn֖7|BlN-ɰKv8gǯK籥@y(9{\V(~Y !@ή/&Ω<#"`R x S-0͡U%%t(w Q}|uʎ{ywEU@$rκ|R 46crL ufD;M8TPH l*3II`s V'(}ҤldTT\"ĜLh]n.Nw2,bS(А H"ÐAz=솔](TZW)\SBb[^F}  $4Zo~Y3#JRh.#|U/ ;}//|rrQAȀjA*ycT&ォ6F"(ChI}l}ījRkbA=IۻK=5)^SOÿ ­ 7N~uIC&:AD:$Q;6AIQ9UĪ!@5[ȵBA:GTq lxbY{6g1$`'oo~ TV4E%U $fͤeaKD(Mll KF|j$R4@V4 -b@.ܯ}Ah\T94䧌olE3؜+e4RJ &FTZ.N  ZuDĉjH^ Jbhvi:2諞=CX]&矯f!0V4Sɷ2DC6R$;ԠѬɩ?1H-zƬt:%4db)&!dqed4!E.׼:~/Le4@ ݻ# gG A"L5;*e+f;Dž {CRf}ð4ɸwC*bBbz;1ϫWqMXBE׀K SG>֔8 ybĄ8]Pb%2]ŔNo8w79{V7 D\@x=6 2 &^P9AΩp/q!Fl>dr+(.FqxS/'7Ws2< }teteo 矻?Z, *mR|}KW>v:>jW7wvxox1AU;J).:6?^yj8Ko ~4_>>;?סX[ݙbqs8Y)/ǼȘUAv{O :u ?J\. BJ,MFAɠ@V' 4ItsAAcc덕25Àr]qDP=Gݤʥ,M%0HTBh&"ѰT7zQf6"u;6@˷3A.N%`uXY;%}D{I^y~ڰu.7'l ea4n~L5Snk@'U}wh'ZJ\'8/%N'P6O0Cx/Qy۱1Y6kA-pPQ@j1 U,@HQ{Phҫ ƬdVV1Ҭ|@vdL qd"@zs{O%JgġR! 1@f·t۪S38/nُђ>i> t;6fpkQN]H%w9V4qM`@:0(ʷ|[yܨ}Ml[HFЃHjG$ rn' 9exa;Bf!mLFCz. 
݀Qkps0ٹ,Y(J.nG V1WD>z}ݹi P/-szݽA5i&qF X']XG ,rk&$S +J\0U6Rؾ;GdjG}.hp9nyV{ _$ %kL,ͫFJc- |43 =lq]=}~7bA ,7G׽}-g_Aʻӳuuْsaejzsqpc]3 yϸv# bYBtp뛯>Ͽ~~|t۟q<] t_*7b؉l %ÙkЊ݇o!dzZzџ>`ny2GkϷ%]]!R Y}A''t~ C~7@:m˒ރ%WSqL jP*Rh֥mأvz^Zsu]WG_РRwm>f̣7C=NxPYsww~f~Oz0>㓇3[3 - k;5?R[JFj=Rs8mfbZ7>EϜ9,,PZ# 40 UR/ ˓Jںmg9ք*2A9Z}zܸEbR c"*us o~5wC Q| )^ơ~2ft);WGCXK4zuFnk.W&\0ܪ+)MpEyVz`Ϋsnd`LZ1:mϝ4%%n3^rp5錏ľ t4isn.cUtڻ0JU[I$Vy(F֚Mbxppk].i[`h܇&d-=Tл\Ar qad{`\fR0ÅQUH{Ћ^ccJ?$MdMjɰĚ8|vukRIqƘ%'䧶ndeĉ2)8 <##ؠ Vd?S2EGtAtúiQ=Kj]䙀9s%&(oYai#s(L(eAV2NJNm@҈3l|HpgЦ&ȭ"-?,l 0nS17ėe7`Lyb&HF gd+--o`z[6Y~HfkdC\SZi$岯h oF!oz|ͷB5k%Y6YLs ڈFpI^֭R?NYr.$i(ryW @qh "@޺pudz)CV|_fhg z Lx˨,mp~Á\δ v Cj[szz.)`귲.cZh%ܗd?exH!g~` V|)ZW @ϧ'?ݤ%=H&n#S')i^vk|{'zv8 ,ӫwrZr_nqbq'z3b >molyIq^,O|ߺ|Y'|ztgܕIvWKL}TZ?Z>^,x];> ],?˩L羫9K,%-"0:RKӔLw&}U/W~隘xejV2 !ϐb[9T~Sqx>~K迉,!<kdƭ7-{ Ѳj{)DɯwO;)>3\YLOʇr/&~:%QH&C$eNޑ,|!//~J\^H/@pwH,X+nPxޅ]{qɨ/01$}HC'߿^62Z6do!BSRsF:eF;OOַ}AYU_ x} )^>uy΅.af \%Y&E.2ԕEߚQz+-'B]r"ҦKT#[澢| }3d*alǸR^Q(tLR&If9v~ކcjKJ~G=Gguo4؎ԁlƭfcRj<{$қMki{f"+Elhٍodq\mJzǏDMYa? ,X5d-ɿ]/M|*~jzeydrߵS{p11s3&t)&W.qJZ2Ms>ܿ޴Ω 5 *R^E*lV*+u.96Ϥv[*"7m{VM|m} }:l۽ךhbح5Aπb ]^H(#Dztڱܥe,76IgiiM+%OorQ&,l|[3}?ֱ m ZVwJяU^FG{+1}%.:9l" bY;ܳ\Ub&iFf\f)cUCnŀ̈q30*2;j+lq= jcDqm+~9JqCaR;9-NicL q*q"ʆoq { Ĵ{(=x+KY-Y]d\bu[H#G:טH޽o_opmIMb iuH+R@]ɖ{i5 +0N{awqA ^vc"€CXALB+O ԅC'Q螢ȉ4QRcCq_5M Cla%515V?-wq>wq`\5ݎ  +3ia0R]ZKqe'ɜ<N!۵Qt7_[}]҅RIXƫ c!a٠Y-H G/wh|Hjqm=$lo"6^pyXnbtz=\d1}rm$uFC_?]\+VrŢJG4EJdallx_n.ik9CD ec;?PlHHܻ.i8\TTƸdu$s:cRRkC*yνWtsWU*21S(QL nue>pKJ`0)V`ZYmo~*~aR}_Wd)ɚ%JPM2AȼD!%L6/oO=ҵzv!dǨU'^ZHWȎB,;(V/=&V9E, ^/>" 둄JcucXuC\9i5cVFSoL$g'vbk +zvOW*R}79f̎e={X{>@];'hoGG[/Ώ݆=E(~f%}c|g&n26P&R=ț[͡5UuL7*ƒq,]캿ҭ0NII=擇:ܙzA Z*zaO`XrJw]嚻kIr+uMåKzpEԏ ͋{[>\ۨp)q7U$\uy)ڔWgPD])$֏ԚJ8`Zԯ0AGRqap<*&(J0 i.aNЉ2̈$T$;X۴>+QMv[D]mPDL8HY &uK@o F<'5{'x +W^.~#s; Rv[ )^PDӭPÃ"AT숤_ilO?" 
NGBH@K SSSF@F-\|pb(^uSc{)p܃PZ6 g*B }\jaS9M*P f N`3@)RZި݄7?xrqFϙ\2>dltc:o_,@d-W[ ȄCW{m/l0mk>[ߚ|^ݚTY,0mRX:Y2/ZRh"NY)B/ިl{ׯ2`I?%5َd *uGq,[lDnkU19Fj0!#V J1_Aȥz,[l:Zc 7[c4vA}!lo2HV&NǑp)mOŵ;\J'W lY P +T"%~7Qnݥ% ,Ԩꘉ;!Ck)c\!)/W{x :~E6/?.*<^σ_ÃaG=E鱣ѫċzAdl^{L<n vDc--6 5ov6B/mKK4lwnx60cٲ:=WّUty7 t޹ o]'GϾQ >Ӯz!jRRQSJ1VE+6"s@u91lXٯEZ=)1Y=V~f kګ9!H!9m37rÁzMl3d04bp1z VK.y71nJi9}| 7j%* I *i8|J9\i1n 3I Q$) ?~Ke/PrZ*:9#vc^4M{ Z =w0 @A% ckgr4=&} ½_KSKI2k'Ӆ5"TcEwr`6#ҟ Yx@g}nc3~Ƙ2ǗCbWZP.ZGRs뽮%3Ѵ+ Z2E?.iUXsrD6[ZY\iOYԂxI~t:߯rϴ;`ŏ?F$sL)l mܕ"*-Doz ExA6a_QD.!H`gC~=J"f۔Y6Lzմ[m0^ByģsJqmr)% 9jLL9N/3#禫uϏEz6QMBabۚWsm~"B~I_=ۚ[Z5V2֓/V*uẸ˿V S7EBueǐ9e5XGEY<@F\IkvgV R\>;ulWH_7!w%mC5>/I{I<4eh2Qr DZy/s;osjMuC򍧣(du:>/lOxjk׶VϽ{wd2,|G36NY\M2E5ڦ,G*tUtuJݾC7#^Gbojg7RfW}FfV!ZKPC77Ú6?N[|3UU;GX)Xʁ(~ɮm7}M&*#Cb|Ǹ6rDofAV-v%LjH5MvMxh3wu4dK}%L!>'z 3t&[n:9kY5z(_A(Upַ::}(Y&Ĥ!%!nt⑫#\$ JT;EVUM|foRCH~Q//",]羹ye#+aHeml ZBjF`uh4=~4=v=Le=4=ኢ*xRvx|uucJBhĪȼZThxU8GA[Od^qvY5UWys%R7V*ifheަ6b ĔԻ.ݒN q?vSG{4`W]F}0qH =cWW?kWp͂W į@ WW `0"JJ _>H:Ox.HRXbVPQQ}v}כ(jۖl0=qjJ()a[#v# K~- vzzAR*Y*#Y(b︕̾zJJ.(ݠ5VqhcAU6C_IŢc D* \8#Z69EZ}90# . -TV؋4p  ak!pUh ę7Z{WM7Ui44n#Nie%cqJv Az9 n%HkREn[HsFN?iDfj|iw^2i"P56G3m4&|ߎT8% }qXBDHQbp_Z .+>LîF^fhwÏaa_#Qw5攫8^q99>AT®ODH.-)fz&B h=!!7W(Yg( 28JztU7ckvi(M jvo#ΓwdO8BXc$tk7$bB(rtqo~]W|_!ɖjG 5#;jRc `JFٜ,h'7n\ <rOW#ϡz0c܄9/^%+Mm{8!G2Lq~oWP7u:O^Q!x~ǃ,9d ʮJzaSZ H% φ0\}=~7Ljw54 ׻lcrNnj>6s`Aɗ cw~>h7'鲂LHn;I/,l:N̬NQ"VeGsW#E]' ԝU ӹ9[[PlJtN=W i" jbOf(Lh-|tÖQ|;`UKPhQ}d.'4^GtQ 2߿W:jK SԄAόGCsfz au&єIaKCrHibJn%-\`>\A5@g Z~k^O8a/zGulWӖI`ۮuE~(N[ 9i7igC7Hst2c{7ã_SN'Wsu^ߝ0oKa|R&:L݁O)׿׳?{?,ub!Xǂ+ Jg?UW:D W<**y"/9 b wu8t~PMc.<9 )&6/oM~:ѕ]3go^iYv8{PsG70}J3ss5`+zN&"v6uwyb3STS!ϹePV/dʮ+w/zKCz=Yj`oy>qĴux=BV8iҮmG<Lg/0CE?O ekW黜 jB3 710WkJ9(m`A6b7BNP4bׄBq(2濥 @,C@4,,!!e ,Bp H+r<;h>67†S4b8EspS4bB2E~_RHALqվ \*<!0HFOޘs]1סC^^*<ԸHie2sJÀ {&:#_ʹ^F[Kivc1ӽ:=bgZض{!Gv,zv5os0m7.&d1XveG"O(VV#͒- bqkxE+^s^#řvEL. 
—>bD7✻gUWטD2$R>x^qSs^QFw!Z\8Ɔ^qdubFkp{tI) s::FA|6GCmԳ,1Q^1f =>?j;>Գ.9tYp7ට`"DJ{X!|/* fBq8 tW@g?%g1yĊ|+WZhYgm^8WS/nW ^ha_3!BQQ퇑#l6&Њ tsiGрxzA0+I^l]+yc*w)6z#߬2w.G` p/s&z?zu.?'PciSze*x+{gX%2XNŕdX y3UC(y]2(@^.u1巀|xbxbX!R&{YS` ->-)a px=a㻁 rL , w#y'Λ[~wEY_0[{tғZ\[0'lv;[ ) p/ɣ޽ оU%b lrAh?I;0foґ!V?qZ`""3nw4 <3ȣց '_IՒju` p@9g,>|FF@ d$B\KQ&:k;9DI/쐚,?mQymHpءs }Zv(Ŧi}% UbW-snXo0n`GRU#=[ȫ^wl!"@|bϿ u; )wmGWKy0F}X 2,O6yp$Ge:LۄC6NDFy3e3CsRr;浿 fm[*3\k'!\gZbSZnus-&fkj{ X9X;:j3zR`"C,JXiǠ[ x] 2uM _IUrF\f4ZqVU %  Xx j-GPG CX]"j]C;YiDq9T-Yf(qV&Mu0m IJdR:ASndk|FO?(q#E5 FoW?duxyyyΰs 7BN?(Ύc}rـwvZ؍K9 YuZ=a1a.U_Xc^|O?,ij&N^]>^twfTg_xyw7¾#_^/4BMI7ݤ%J=Dٻ^}I%ʖP% )6(H42$Z""haB&T*Q}k@nu7$n[V?NkL@ iծ#Z:1P L . 2Ƙ5SE?I4RS%s93#5miG"frX<{tQx4h`,!D84l@Am"nfZ e+`!PV'Q TU,+r˵6o*(PP}TlcTn>Td8+!y>k8v :BߨTCE`9/oo_/A|ÍsќFuC#LEyP[P&{:sy t"m6d G:U\R5ޭDۤr˪TLoULVZ&X+R/E{HYOl07HFB( QhT]i݋f{$_~ng^} # kogFur4'm{αs,-lJֺ` 9/PV耨IbjpJU̍ 9|pgCAYsbͥ"u縨欱1CP j77F ,Q3E12N db5PP&HM)`?NRќ s qv;6cu|ȾhāZ,5gd@&|$7l0gx!8tswmʿF |8L{ww)O_w۰F%m|d=wBib|w/dz~zuȠhx<oK}1v;Ƹ{__{".1./~7u?5ww ʸR/̗㇫+4_kGsKE1ќU og14rRmcM))F׷J$)1 jRWH6`p!=hv W iο:p-JB(5Qs]CPa%{S1!;$$! XpR}f*fINM%PHdx\a(Lr-9]Ie?K1D9˕*x l|koGAS-&3]1g$0\:  WYmhԺopS JA}[Ajeؚuуubx g;u 0Ƀ1{E.nG*Ś @_1<ʡ%PoJQ#a[pU[j֢lQVϩZ6U\Q@斪SmY3s9*M`]Yp1Z:\G K憘i@ex,TwûN+4wV:wJM j{'56Ѥæ4jJ3,+#20a.i['ed65B&kw4ڶV46.gc׷{+l4,PWfK=7&h V JTv:q[y9;#,\(=/;AfhӘ"݉JCՎ Qf6msm,S#2J^Yɮ6"(oIa]5raaOfx>4B?bN4LHw 5)̔$A]Y0@LGlw $[6r2E'c{3҇~(|퇫/~}n;go(`Ibj^ZtАDl&qzr9f-k̹)}{S`VmPͅ>{lg.7 r5㷾<Eog4}4Diժbmk^G$9romE4 (ly0tT uv33-F[֨JD&OmC ; lYȬ8Kq$Y;g^Idb] \"@ C+>j%@<٢ *:.RJ&B) N8T [';"$-7`K)V 3MI!h6 *>Pf=hyyM? 
jM\P$dD&RlwIeDN!|s~7s`} fUT;͌۰foYTn21v}[7)%(n6iK#X-N{ɗMO;H8Jw/Gp.TH`y&)GB>:;3@ǰ{e+)e߮gDnBOtɗXi !$acv&29Ž!Ndwygpg]枴6;iJk'm'1h;?'7?[}?v6cBu_{ǠgZ^^0m]c>v8&89]r^ 4GrՍH$?rg ΝU?]_ۤulWr d)=uR_N+_,7>[oo{כW>Rj)1 TIR>"^~ ʿ>\a1>V?:A{q~xs;{g73U?_wKe=8Cl8^]jNp žkkZkL)*b1m4iX'L!3]*]넱:c˞g P7?#DZVo}Ɯ.eqH;?c]{Tj/.4z?[Sg.+ŧ#;i斂7mN/e^o7{E]7/6^1o*}p^zioK^7{EpIYoO:g6a~3?-HvFoÕ^ܡ'E?h:/:mU*/ >YxQ`ӌ}B[wہH/steisq֗)V A< /q˚Њ?tqP,frHq\|oWNY='Sh/qus!x,`K#nY^rvGƵO7ur*8ټlJDPg+4HJ, BJf*eq8h|ht7W|kX +Jrٻ1Dk RN':/x_Px'M+:5n)!p)hC"o\MtsZf B-f],xy-)uC,O>ChrWB[;vߒm׹X(r <ؚCI\!{yiz|V`OA}YRO5c_.s"2s E_Pqwqq>n`s=1~ftr}==v;4nԺ3E%WgQSG2-ee $@iѶǵ1I*qNȏ?W!"m `LVCTٺ/ &Snm EtB($2bAJ`ǟhTĂVzj{s"=pB9Qy)JvQ# I֍' ѭAx8{{}J^OgӉ7gNyAfyˋGvh *&9#co7wԞ5ڋkԅAEhU=sOPNn ~|w{~>ګ <Jd x hFyMz9{S:_u,:!y҇f.!GEP8!@`"N#JZH3ڷCO֚8t\r&4#$}0j9 Hv/sٓЯxEkH<q'#D Zx*W42ägGSR/el KgK,dU|:h|o=ӢTry!yۘNPq?ʠaxR N<1&;v=JP"iGZ7.$OtHk4۬~NyvrG(zKPdTr.רYVR($ 4h($@#ren<0E?NjPlj@vCK(XXl hb[)4 "3C9cmC{H)ͨB;yWIUfGU)}(#,J`N~wø6[ Yyὂ~*VWϻSqFRo߷yB_4;L ʓ& Lx%OeԂwԎ_?*o93QZZ!*`"TW:T :v_iGesåis2JhQ .` dz1cRܛuM9(Ejh4K,yjras.Ǧk;G;yq>v/CȾ{QػO~vzҭu41BR2(B^&fFPFLZsA Jq pdk{f{>&4kVN xU*ؔTD0Α8) 2GI&ZqXo>T>c .+2>*"ƃr6^4w Au0@9XV8Y5P?mpl.-(.>eDjt;XzPA; ]G=| Ǵ&GU i*`Z{k~|3|fɖd FIZb!Z`8˥>7(=d)Q.$vi TzgGT7Y,chZզrhMUfLjZ"EH=wdg|"^my0kBx\DDQ(_bsG@Ǖ/%LӞs(+)W׵|J!.q2q p-U+55 ;f9DJ{rF &r_诽ܮ%oTlk ,ZeC `Pt*:.E(&R7l)Eh4yݮRFve1E#5{sgx+ҹ:}6ᨐÁKd{ϕ*{X*C @QZ"GĈ'z܅WN)GTRQ}f!&&!H*{q$"r!}H;HV+NK㹚 .S "``4wscZY]듷I; 0(8ڠoJx,T B @OcIk( &IJ4)h:/PhŌx;֜oA5W}A\f}} ڃ0xīW@:|MFɌ6pj+LPpƽPlܩ]jmE`KrsUghpX*(g{.]/gVf(PZ*fd79DH%$|Ej gʚƑ_QefXƍ#ev=;0/A`Iv}T,IIv2 tK\F1#y!lB5[8[EX)0Y5rTJ 4$1%I=NJ(i#hCt~M;ˬ&{: wIv Y]#w&4k{vR_fcw6n?Tm~>Ϛ= xԍ(Oډz=t}N&`Z%M#V.VuOr"%Smm&)GnN;h#-5i쉦j>$E4K8yyKqe)h\ RD'] Ѵ[DS[r"!S@XL2DY",-䎓{ہT q5'8SN)zʀ͟} l oMjJ젼Y5_Bd2'l)`$Ggf*sK8YrvDY%YLG]'L\D8ëu 9l&yKݹ̈>6M&`@ookɽN,i޹(UPg۾~_cІbIȐ܈s(Děb *ӷ[n'0yO4 ;y C;6"[ʳ4‰\iR+|aKq\޽">vZ MwKm\W.J= T?%!CX* PdKطF|Jy|LBqvD穽!pDb41* R a0 &5 ],Em-G*KR`XQNOZVVBբ hZk\i~܂4=;ҏ,Tb\m%&_M.iBPNy;?u'=eu瓐ZH{ѺT ]hNޙ>: qqkH#!DQIF}Rǩ~a!dLKC3U僦y)R'D#jVFK^hJ}I 
dNq*{85XNSzjWWZ3Qܒ0W%'!:Ͽj^8fvuEm8kz!2Y|v B{ދWwѩ$b+1OiߵYKR1e@+f.! F*!3ӌQE3$E2ׅ< }B Yq~nmg*_)jaN׮Jy U)]8~RFTԥE+9*= O`N35=F^:B$.҉J{n+V뤔Z'OބJOm5%dc[ոUqGR-UK 95~G夒PVol /?A\lȁM0y.mm @;&nj'TNඪKje Z_5#٢JbBN)@neJM ~8 (^Z"$V*SeH+eB yY#L(6ų@krt`VCЀɈrRKeEwcrڒe)#иPD+.4E"šp )Q/Ju ö́p& KV",ҿ=ٿ @[ TN/R'2\?\vxm)Э߬q_NnS|M_7Rp AFW%fDoկ=:|x3Tii*!kWs cS 5Ҟs&_rs[3(@:܈*o}0h {ȌITB":U"hԺJp*G‎Dz'6!#z}}Zsc 2:t.ZO ko#y%~H:gBy`&A` $ q9?^¦͢''}e{ZqDE 5P:zKD'B)gF)"+ޮ=2crϼX z1K8}%ôxzfg,@ ɚwc66sgrIjyǙ3J+FKή+iHHJ:z;@eX4hI]Ic'Dq TJ$HaJH4n qtfm3܀ a< 7 @2n]Q*Fc[ FcPԁBvHSxg[AkLr2QdӒSqVtna $1MNU\nr##wPrPH.Æ*vݐWCuӅ.5X9K?J=.ܠ "8WZ]y۰!]ji8Hһ_=-GIHv@3`" tLkWٽe䆋S}lzz@k†OįsKZ-eDãta!7rI}Y<=:3wWK:Gs_FX:&ztgm/oAuHH%FaҔ*$Z"> lZ\,^ ѨȤ|uofH˷Egs_#p)?`8ʇڊx.-PNUV@ rYJHb5) p1GߛXSS˭gMD7< e|64W4o9)0w./JIst25PqNFV ]}]>=KG$bԨsxsuiTua1@(kIRs笢@+0w[8)jɓZWi]$ֲ.}(#j>P2;ͽAؚRWb=Ozx%γX_%FCg.crJErjXvLl2[kXvz,vBHl1giaסK9Mi堧h~ ;6)bY|" 3[cuH|ʙ_n1r6QTOe*FSP=Y2tuAyС7-iTNUE,LFmsH69k%\7*fkd8x9ԁ$=6H!@>x= S6^tj-8vPL~ "˙,j|oU!l$` b8EqFHhbMث #^yq {ʿL9{b@n|b_6Iy=R0y*l7^_ dj-d oѕx,VG=#Ԅ`oZ5}1Ff51`Mdͤ \O1>!~on;MfdH&^9yqoIn0NXP2 UГ]yn.ԗr Kݶ5d|kK9q-obQ/Gt3JxfdeɆ+#oCۢ`t:RW/m y|[~i-+HZH}b~ ?Og+'s<%FߐИ7&#/Q>w){ j,ǩBW^e\KQjvR~8dV$}VH'RE4|xbWY𕾗JxL"A!VBB /HWx-GhkXF2Yp#FDC}XGƗĔ3s bXq93PJ0$$qͤ1;=3al6G>k9]XKHKЎe2}[v#erSW4?20$:P#$SVVujQTVWHOp[k܆J JiQTVVךr@p@->$,VRYQ!BHcF[5Ikd#)::7}CegטϏ|y?5ݒ5 3 2 H06F{azp/N0TTB͔'(NkД l̴Ƿ[auwڼ6ejҍ$[qҐk ν=@y7u;Es-Od EZ>#Զk 2jցRraleףl޾vaV֕V{Fg<(cr?okcɀ]s#g)3 f dw^/_Shܗ ['=ci %Ͽ=t^s;Nr/Y]8<`Et {,V=e. ŏt_P,UQ59C_4sK+Li&}"D396_00-&DSɔ? .kߔrU=$/FT̀tmhF_Ol:/*e!jCʰ KVH;Vw1`S^ o|Db{u"=*xpB" r7DdgF,$5f^'ȋϠQ8s/MY6 b!Bvt$7b_:fdEQ)8B3'Tޕ4q#e.];5sOaO̩;XE)>ŷ(ѦÔ^D"EI!xTSI"t; :C[VQXx}Gy.8xP8.!JdhB'&.S1%gfmf\WCӒ'RkI*^d9 JjF}<(1Z#`sgH0JY;# RyGd(-~\SRVs|.r/D I ("JsoI? 
qou %`߭YOB-m ը }@ˋq%̘vq  bwWqxS G]q8U\Hzu phRS#$$"H۠]F7A+J:MG$i͉Z ݿHȥ)$Cb zYLD@*48;::zE}ݰ~ ;fˉl]SHe;@ A/MT@m@ @%R&Qy#Cg/&gBV>|{t \|c^ Rh>WGVD86g֢;8O2EDՋ~G^irzrTwx Z,"*&[a#/J`i.Xeute|$&zej 84 .9SІ>ktUhG&QJ=^3cWe8 [#qJQraTpCcy2RٕԚ>ٕ|ʆo\ZI*NCn8Uc$m\v_v [NosCj`!=Ln}[/GO*ݺ=@!'X* |I<|JΓ3Жn+PH-cP82$ 21Hb0Z.'!hb gmG֠Q#48e9KE ^;t=%!2U(ABR,2.HIe)(gkɈy1C@3Y,9!Q-Hq}Pb%|BH$qO#vT'n2:&NC@(qYb g^[8*ţ#[}H E%se8F{#fhkei6hFU 9gqJS3{Fz YFNDiT4TD-*2D%OΠuD绐1\;ʄ\\/Ñio'⏉[uzӷriߗ@rlOra!ޫ'p^_i7b1+jroۿ??m ]<,}\|ߜnWNOJS !?\ۻXEߕ~z/p|ஹeFB/"Qivzʟq뙔J iu/AtY:.C Aw Mk@Шy 4(b0@VgRc}Fki&>)Zj|;R4ךq 5ds.E葑VŻG\Ynz4r} 4~?_:k'=CtO?}t?O>~ i;#/W˦?}}.ɝxx"*_r=ڛ>iSßo &. . oYuu:M7kv3k)EO5>hмIPЬ[Ev>y5_KĊ3S/(7]΋.QuA=HM.R!KIHjR@9rqy=Ь t/g]B%F]rqwN{cM}3y&ZqpcI HB:B .n?_zNY ?TsI{pnZw.! 1EkgLҗB; º s5o;Q5eѪ0udjzgm'[bCTLk_qi;Mj4\-QZ|eWwo>/~FY 410#D1C?s곽yuy^^*JjV &!K0rPO" C5y鹹#fwY`7tsak6m\Q{ngC ZIU Uj{%l5=Ud{](~7kXPfQwgQ!&yΊİ&P% w0H=zV𦝎xl>]&GE` `v,s_^⌠d׼P gZ20rN4y\3}c##[=竌0_WE+#c{giE1F{<3G&k90U0~iq_ykAEwW'p[ 3"9kǽ .Vl\~b<RgԀDn*̽ !wx$TEH^;ΧTZτ!rHy5D 9?3"gML2l@Lլ9o%Y Ĺ%#c{i#>8I.I`( Ix!Eeϑs'I4q吹|?ojrQ;z#gӏJ{Jup%RIc"MN{e Π1 P#‡ji:3Y &Ј4 ýv/L1:Q\+\i AW y5"*j:`r'y0%R1uG޻scrU+(@ɪMipy4[ ok yE0?j=_hvn^V볅F*U>{ -+VE{xB{/W-ta҉ݕvqɹ{N(3LmpN$;E]=:aKB NPtSM&)T~R۝LBMQW 0]}(ڻCl|o؜$p6 %J2ˆ1hf`2kTwr}옢`܁mS+ii W/R^Sb^-eC_=93wOczLXӵxe[[\=Cg-m8p@W:h7d!pxŽ̨ 0"Mj zJyCG$2}.ٖ'ׇ1ͦb &*ŝ΀j%75ՁJzU3M}Hv29YKqy')ĭ5paJ6$GE( 4SޑላcZiQtԔĄ i-"UA@Ttzb#cG%A*5ȃ0{O ^LuњS "2E mWZx\M"$(xE< . SHVc=92N ].GL{b#Qu6Apu0-whtYKEaUHÿTf\Vx%9qp$p%4+$V4@FtӛqTs]:{CmZ{S׃8-`@+Un|(.7Ѝ4"mDh)]ԡ&$Z :pn~Qe& H'-2\n 1G0igUؐGT]Q)A oqFG[]"aI}T\^판 EA#aQAQ(~F`s&8*x3kɩ!G&lwfh3^G+- [zzz*lT3#[r鎂3Jeinøڮ̨ %;r_=č£K'* S1_a t+[#4<֝\Li͛Lh1$_ev[e÷/k .I=tϊZ"b՗ߺwZ[uiƆyQW+= bc:ؘ,]pccX([qdfaYŋ ϙLUGʊc3'gi QQ&tnst9zr {s4w C:{2dH${BH:BO;=<8#6p$CMJIVnK曇0~oVNYhYGjylYM ۼv(EazSl /T Be XC$gu$@N@yἊ1hkeD+)-vsԨ!A! 
|UAoXdHDu$PR\miA0AwyDH<~Uy M]Xg擀|5&W3Y@^60T=][ZQo=-0*t*A,>VY^YZc*i uRBs´ ճyzP5LbFύI7яMδ?Wt̻-c3iT+o[yn{|rv_|R}>=qw#ģRC1:*qlRYw?nH_K |@`;n׆}~sF"i;G^=Fl5jW,VU,X]HIOe햻ˊ%5*&jΒ ~%ō2ō?gj 9WqSi͔@s4 f G lϲnN)Ȝ xFd`w p Q- 5X+/A+FF/#сM9L (8zfj(^ePX`v\=6ffnr^MGc]rOuuc>xl1 ?]j_g/VO|JXt/wn\(hhiN~_K7FZi 9{yk]nkLgv+|+3 S-Iw.E2%pM&~[)rDt&VK4صv+hvkCBs]))S<$$_Up,tPp`L gxQk ~ H $x' :$UYƱeG J QOxMXuR, ViW̹crŎS;rްbȜ8BHEX#ׯwcˎjl ZDiҰ?|QrYK X+a翎9O%۶+[g{xnbW~5/oeӃd 'e`А gM[č UXPI*A'$i?}=&myUIJPSXVWP^-i5v3+&)e4P}^mޔ\0 :6d'5$9\I=CNh\͞a"Iҭ>XV" K!:6$Djp<}dJPS`I@k d*KPIRs;h' ܊grd-ll?⌶79oQUlH4q}4>IԺdhz_;3;,g@Cl^9"gxMw:WK*.ftɽ..)"Q!Z,ŖҷH?{ ң58J/n4_:|4;7:>[U nW(šU,ڋ  ؠ6}~x[9XU?ʜD R6!Bh$1(.u5Sd>#柶|j ]݈ 4Iᯰw `͝(k< #|(镢UZF I2y˸2H*F}8:!!NF6bc: , KNZ^R'͵fg[l>O2Bf4>AKPm/<[cLa4={EiiJy[wvs)XoЛ qW 7. vslX;h<JhǠJ:(1(Z'?.l{{y`<'bz61mxUֆeN 4:ϱ"\ lCMwYJq!JKČɤL?1yIsY uA%}نfW7jMFl#gdUe\Mįm D%]5ُ`ydv7P'yR$_&ooikk+fzC9ڣZ}rb=VL!ewQ74?Fֲ]] X[OmM^z|Vr~rEkP`0[8j*]Y"Mjcdkp6;Hg/OwꉁA ?bV#u^{?n۽䬂w= $T˻QۊQ[ZV1f#'%QLFK-a *1JCz?Zq ':oS )m'%;&'z-=;rhB# | cC2~߰q /}YsfH:_9:%(@-?)O`9HXS Z^CNyyTC&L&sD!Dq/ +R1J$=r*,VK^!E I%C\mDǿ*߬R qqye;`=6`/5%Kw.E2[n")t@Vʃ)v;)0MȮ[Ds[hLIyvCijRNM!]MO4W!!߹Iwn4LoúD Le{VStPUXз˞UfU2iYo_q ::^|OwT$i aUƒC 4A&fZX^[ :XMAy $z85_~cBb0۶EJ-F-ɕ[]C[<|LG r6"pR8l$$ұf"z6VB(BG b!%C=K&8H)}l xr^r㮯ᶒY/_\*1rR3s}qzMqY۫kx܍Sكx\Ž\ևvH6iᐠJX06ռ >cZ3:޻dj,>ar0y8[gb?DjgCaa_+ϒL9>5-%:xJȐwR* p\13FBi!]_GS 'W2bGP傀bb&}:2 w sq Z"0d|8n 1Va$^UgXA5[nSa 4B])$xn HZ  ca0E2X3+܅N1,z0 CL0]vXUƛz T!9k,;5̊DBȖ(׻{ɼW՟_جdJMHsC}7tlQoFEwݗ#M5%O FwTU#;xn2PN?$a ,PzHPOa$ I6,l_ be4I.b8?/G_ST0%PYROfe{9O@P7v.Ӵ`ROS0^kJ@C{xîZR}vzYݟbl/[ 8!wF!t_h迾ZPdzcKը=7vu7TA+T\IQ)s[< ;Ynac>?X:K4οҖ1%}%74}7b?x͂fu7 ?Xu]uЦ!xxrQ>़ ]U_GDrԷҽ?"RR3'ƿd%\q9YA>=s{nֹt .cBqe0"J2/+慎52٬#gGna=/͋4pY3$ӆm]lQw;TQۛ&n!ɟ`7 9y"D;T*ץ{$Qt== sr#X j+ qךK4dΨOi̠y!|_gA3DR%<7"CL*İ3:|1m ^9lī} $d@`e)߷$-IâZCv SҮ'$1v]NKx79CIԷr_Ǔ_^0GD8BD^nѹ^P$.VB2ғO#QB4vJhďyO2X%tR)% IO>)R`Dy.`/9*UB02e% ]x5i\h0~t~)"<+=b|5lvȏ&qRpKd(@ 2ߗb^< #cH(m@t:T?EZCn0:P 9)H;orršu PQbJ(9Vo%f2\6kDr(Um8/EwA2Mk# @SG/ȏ/p Smڂæ{Uqo.:)G㿞Vwmݵl&nlEd6$ 
pFG4I]A-8Cwe0SSTh:1sPEw8؋OBN%"BƤ 5 ydժ8m̃i$ij&!8Y1wd\A>X׈ 2oRB‡{npO/Tyy0ɠ'^.RwI*N؄tWqI@2IڽHChL*OS0 ibkGB=K̨;`2hؑc3tZnG]'׮p?",Y'LG. iխg|=|iKXd_ޱ?G#wjd-'yl_O?L/%L Kϭ#; WcX|m{$4Mr\x~mrhx5ejWp!=tv tb2hϏgSnR n z.~P}3Kzup1ydW_^V< ~k?9OYOfy;q<-ns4/F~= ?`~Yl }w—fltR6}y@Ѿ=|t-$f}]t8t6泛M${ca_i4|9zr Rc}| WNA_C=d4_> uMnqZ| eyy{/s3G#K]ޔxhr>,_}5ZL} ׿->aT|f`W.5|sBF\G/s0oq韗O(ظXdq./}N:,3{ fz$u;/nC *?B*+2rTӧ`r]^o;rAۣÃ(bʗ'1u(τ^C<O"pbwu~~)Lg zsٿB_񞁙B;L3Z56!4L;'ךqZ SsvV0Lొm5\י[cı"مx.ؾc~s/9+:qoEǀ5qP ҃)ثX$FkKpJs-vvTɁz?^"pD: ,sÌ[ANxI ʴ)U;U;\iF J,-?D8Ǹ̜k %;pf!C"y x{??驭`7o_^ǴֲZi-:X 4IbXgXY/RI]20gZ>ץulՎ*ت^2b/"qt4AlH]/paGjt'Ng0,m^7@j}562bJ$I3*[os  dHXoAj:/}#a*I>PF:>:g@ƾoS2_Kp1&٧²"TN -(/<;qHp`&xXof$(az#<JgGΟ*PVVpmyzWťx~"pxkͰjR H4Hd$7$I"B>^Z/3O)V}UR ,ԣ"seߐ5#=V鸏DUFB#sl)vo(LO+Zz_a-`2Oil|8O=''3l $C啑\[epؚ td'A1wD&H@__Ӊ`W3Lje^Y/ʯe#'/UW"AMB*Sz 1bDQl4YO`-Q l)U(h Zq! 2ߔ> _%!Lh5":EJ [$X=x|v +0"*ƐB4UL^&-')2sb`}8Ra Z,L'vYD8Џ"oA1u}˽~`eD+loq%U_ VcQM^E*13j ނM, څaC砧ޛq z& |)DY*mJA4z 5zZRHUE )wݖBT%x@,@) 1ȤBSji&:"m&JB(:足TD6qy2騆xKv̩BXhJnj`àɂD`J1\e P@b>yb Y0R5Ϧ1*&A5pW$"s ̰@a) QE$7x @pl, z^x#,M DFRWJ gd h#Z'cbrn X>"Sy;Sy>l5eߤm> fbj,k̆+E+ٰst+s4r6 sOi58a] :gZe{;{̥lzfs(tAޗ,q`˼_e,*N%x9;7mԝ Դ/W~U 7q.)K1]2 {kU\K\b%%iMlNB^/ސIlz.Ipk=uȴ ˫R4es.s]W]T"U7|]^b}2osq6{k8l,|RwG$,bARwH $,kARwqO/TyUJcc=R.u~ni4 c>&L^1d[*A"TeH?( U`G(GW-vh-BX\fiLENڹs Sȇ)8(hζV7($֢9Z9(r^B A(lh|/K[ft0ah{|)4E֭=PTwnv4E֭ {۠G_5h޺-wcՈ=_ge6^f=ش^=~*"r ^ YNX$n~zaxz#pm>9kTFz0J,T]ɱ9NRi57>"uJq&y#sE0F0* [HD(\,v+Ƅ牌%b1!&D`aDVCMrѨ^<* g^f S-6N5^ 5*g8F]#țeʳ29_%7T`Qj~aV:*+f<$rW^ EIeʳ49E xkiFpV7 WYFln$mc(.EV$Igt pw'FҘe2cbݤBYh2=s6}f0ϒ_E뎛+mӝKS[yh$ tEAwlˑgFKTi1:X*~_-gs.M#?=|8K{t%1Jo3oSŕ^ݼ>eH'n?:7'bX;Ctyyq}^8C2-ӷO^~/q~zr.s#rphW3P_>]_H[UY#M*K}N| wȮ# `iڦ\Ww?.٧y"^Qxt?Q&iY_ӿ*+"[7$fż"2EP<fYῐlpiH>RRtm$TX#KѨ^8IP"9wܡ< !g 0T54 U .2J=.Ky'X&,K^4߼_1?xb/+/aM7O_ݦW٫G*T+9kc@x3#owYzq.>&gAL@]G6dԥ@ͧ?k\UpsfWed^ Ϋ';aIX^^pb@r1 ׋]hM'ۛ-O w&]G+*Lot-: eP) E+>w?to'uKCL®%ռL wz*Vߏ&EF(VJ1[RtWG®A@RAJQN^^sqގՖCJ/JYv(lPz 8{!iŻn[`V"txo5$Mu~nTVdrrp'ng\<+G/]~ȥM@an>ҳ7+~LoN|vu2ˊIW%Ln6]vw򴭋@Tn_o 
_%]m6/CTzj4Qv+?/Ͽ1T0k5t=5agD+!vTWRcӢG`"wVߗiu4Ax8B3.l挘N3@,8Jk Q)W}JꖿTק΀]ҽ8OOr?[."0|>6|d:27bO6ʐrEA] #ļ)YH*q!. ;hl ۍ6J}`wbpOw S;e}8S֝Q:e5cT |t׌khDVȰ;$RH(W>sҞ7[ʴo|I}׆5&#`҇6% WUo I)WbN"JKC2$‚aEuڼO)eHS/|nMާ,f4%bkrvE[bCF3ШXEfbNsa#%K+q+45^]دARÿ"Pt_l }+ſԤijMr#NgѤ{@lt)[7E٩esi9 (gE&I0 mj~ brh|^Qݭq"»_W>H` *.47ItQػ,{\O:J>ِ+Ȁ? բ³a˄V7kQ-H/1xQϒ{H"/w^},g Nb*"C`B2!ϐchUυK|ao(MdLs7d_tY"(M঻7}>6ʹǪ6_uwuf$SQ#U0A9äMV9!k1JC}Us:vθ}W|ʶ &6ܓͱ 30`+NLZWiY'>nnbA-\[fH}Q&`N4 *K >+{%"BYh"(+5IHEQQ;5z%|Ss"b(Vk\Th5 'KHqQ$#/a>dA4.ZlwIA-M$`LMĚRJ[źW>"]!?V `DӞ˓ &fw+j)O0H52*Mb2E 4+:V!!WҁdW_T~DK3Kakq.i5ܱ6"[ =^id],&bݐ_x^^YEJO`T?Gˋ1~GcP| l:.T 1i@Zu#1uv}ELRIn'oj(K1\#!R=w+q͈$u|PDRSfr~TǗV O MW!wJZN]=S Z*vjd4Dz0V]{Omi 3UK xGm#vRԦUê) wah^hVXS&~C ڟ䈌0)5cSN8x\ܦ:K,H 4^@⋚hs !tfF$ 51k}_Jhyh zڳ0v&vkKc% \m3h/;GwV}裼!J -ʡɾO'9`mQOYRXAi;WD ##tNAA XASb۝re*bSi b.oK=ctÝ3  #C}|SQq[%Ҵv.ȡlG⒒@Ttdʉ{1M}X5y_#j^Orcn]!A΋V Bf͢׎h0(J0-r 1u<Zr-Ϗ aPХc=>i!]S:̆ O0\XwEs牺m%#LׅBrEAܱܰ(}]:Sf 1vrǐBO;X` W OS=Iרh8YbMQYFWJA A+@}bqQRl uH}dL(1Ga|X,E ~pҀ60 RAWD%J6oJ@p# cxٱɁbzBwUL+E}s]/WϷY8]'v 7|uSORIݭ%dʢFlNu+vnJr`qI4UG_eh8v*S\uI'D"K2/1wNɆsqT ]:t2D"pQ9唉$% KLq2ƙe$K+_k!Vr\DU!sC.4+cc{t{GH}eC c> jeܷ9*g5`ЪMgt ]rOzFF91 lXr|6?j1UĹf 7u:>u#Mn.bԱSaȢ5h'DW2"Rx &k\44R;=iՍ]Q& “w=I9Be J D/`wws ׃Su7CA\v~i=,GC eSoONx|/7p9o#BDDIkx5 (j/ _?Z}Xv}OwN+"8%&ɔNe;߾8t<׎?y[r' 2 HR;&eh1B˔h(>'Z a]ϑcV)3HkdT& _v $˦NAk3 nZfpϻ,g83`Ͳ\b&L{raZ|)2p#"(h*IR"8=1$D2CɅ7;kAZuļRO&(>c@rsLEn\0Q-g41==*XaW̨QRR E#Q m:IBy_f),@%; f`ckw3ԧ0FrOf>̤ˋ_3)h$dp5a< 9d -x^be 0#L[53,B \P)qB BN ?P,rP_b:/F#S9y]Vbk Ek]h3hn|{оfVOcNo b:^F_M%RN<,Ysl@4HJ YJ)P;;ȿCǡk?"[ h9d!C6lcM6q4J2 6`V&[R5K)<\ͷ9cKIXSYh礕usQ G nj'1$.Ld`e`՞fnZ%oo#b# HGf}iCr05]j'QJHGc6Gf.>G¥2/Bdz2 w yo,-A#3=b9k2 ܙ,$Ȗxi8 K`醕I;-f=\Y( Iөxgp/N͈OW(6Z FՀ|ۺ*IXJ) ȃ]\xm& ٹV!R+ֲQ< }L/ֺ1o|`¼h?gO Ʈ.q^RB9܃<)αb,ҎLpxCq3npH(X *r-q1c U&\?;;C5'yіiku̬ cNq1/4wadMN4"A1Ekr.a8N aaZ:ZEbIqfhsD#HQ2"9 "BA*-{57silW}3pgbX2 1E3&g!kER$ Z뤊J8cIBd;.q2P5PRng `@@p*x8Hs: *@1 e4EKVZ2Tjߜ <9m–`DI%A`%0sHCHCקV$Jqݶ n/ sRʭ#=IRVp ;΀1V$8'"g\0$5^S"`9监̅G^6/<1!;^ekfJ6!Fx^?h"vI?/!qϲTQ4C2닐qvwNc 
Xxo_,(W/ȋ_wK7W\Bq]:@7籐yO.`,G}}[__Og3p$pm T8>ӝ`T\֝J~/%H`F ߥ଎sV 8`@T(2UhlЬxDkAqR`J%@yZ[q%6"~$k8+ł8SUJF&kFH]ښղa<.𭔵 hRZxԊW;Wb5+aZ)ۓk ҹjo^χ#S*<2e6y2RBZ i@ւ. 2VAv&HB䮦C\]m%ggbuAd;_6 `obrYF@2,# )M`W਌Uű&leY^B"DyOg5O'e|1?(c(-c^1ᴷ*LRiaヌ0NX$-#,a餝aLNPxY'Sv}ߥiSq\S+CNʥVp$ީ6(˕1*Bn;ǝjE1 qI9@l)?ƕ2x܍sly5HSmފUsmkCoO]}ME?XgptƎ Jry{tiygu:k[|jj3OhPSް,Di%9me=(⓷AR|4#΋evO.&N}t5hd+I]Ey":"Px|Axc$C{, "gC\df)a,iUbd'1SR4IE8NLҗiXemMI#(MOo>K[++)<#ݽpy7gO^Q4.h @' )*Fq ]Mr,rõ Ӣrc,=s؆ H΅8 w7sfrT-ϓu?qk,0l37Mv &ñ>cYv>Ƹ pϥ{"g2aQrl=u6$\ZP2EWRDx$ډL??ח"|﫷쳂5g~;5};xt:/>ʹg GL<ΞBL}^# 6/ <~|巗ջ|Yqdt=ϊw,Gf7_`&_~VRٗ}77AOM\j'f7ۋdmzsɬ1T؂?ۿv;"hneQEoC$KLW{/cO߽ y2‡b2]ϡUR&a_{ޤggw_fi?yɴwozS ^}}3i/˟2NprSB$ѿ x':_ÇS^fCn5 5ɻ!bkz\gZzv8a$Go?+$KGpWWo5te6>5ÓR+M[;'Pf% S%$+*ýy6;7 0Q/B|O>6cG sh4v٩B94=بO?ϵTi*IBٛqZk,yFa3<7ȼԵ ɰ,agFdzx&mYc:*W(3..a6 ;(3ixEjFQgL?BHh]|qZf'_UUH  (b!Q (Ra28*朗\(Pj=sG@fcf0ʻT]*.̼ DK?rjy7d*e.#¼&t͏=L.ľQh?\f3 iK=b.E%sNYV)kt[s_q4 "N^/ZҹuQ7ymۑ] {XFjlA*};y^d^jVQ!U# .} WR hw_1xnw@M̽Oѷ{V;ʢaLsOCbؙٲ%R1e~(3f #M-꫱3w'4F!֊! C-}?B@`?"`>$@DJ P@!"J!TTtOԁ4'Y_*ԵTx's9屴) t^۟5eSFGm)YpM% R2:d\^<4o@J+<ц105zK0F KXh}4*l˘6Z`=s-݅Uh}/Sg4]Er$D6Ҁf{ N¨uxups 6CbXyIӫ{nR{N%MŨ(/i!Ff.\wmTTXkLt=%bqd@ 4.+Fܨ)m 3RJF.k|.=s3*&u#pde=IIZBsE@8IV AnaeɁ#/y)cC8(R h}"B)T>7/J#LJH(M}DP F"4 .8|D(3ؖ\ɸ:*hn]ƙYn(3wֽ"pٗ0:#}fǃYr9/ 9,Nc"r GOjFhh\9!TS#&b5JP#$}(Y.+TG[hҬǒVHҋx||{3XO>ǝg6\cSihS= [cF^=3w]Y2,v06)! 
w $wŭ_p2%=h&CBI *LOqxK攌63ޜ|M;Vo 7ZOh}ti6=m[ɠTSĴC}\4HG=!fl\$ 0^(MXNj6-̹e,ByT׹o>ޒK@Tuǩ܇t~ 蒿i@=rnT]'IfJN/öQDpa > s:Wtt^%WŜ.v :HŨ*t!KF.Gsxv,@Q-$i=Ą4I[=1!sNސ~zAbmp^3%VZ1r_FUj+Ş?Xa@ /d>WKD VY-`)z)y.l~JͶ` !uUOޠ7)GC.4fO{|s F,mA<&~<󁗢#{నH|r<5lHܚUoAiN(ݒCOYTsAmޒC!٤퉼J(|RZ7#ݷ`A}ŘZbt$>akB8@(2 )R`f[u{|K'5zVGpAj/>[Z<#)C /z$X$% @'L?c笭U˚;0EENi;V*v,VHEF'&O_ ߾ި/q4 M~:)ojA 'k> wR'j ՄIڛ k0ov %^iނ7 }sѧ|w *FynF[JQ~w e~4cJuFJgKMI}ynm1Tr߮ ,@Х W ̗ S"A7!aԶ~R5v/::`6W8ư x)|] g OH١cTI ;/q`B>Kr~!OÀ3/&Ja2a(4 $Ԅ)E:ba !QD~Tl\(k>8X`RXYu2l)Yy;wp<&m{ ;apet>NmB.wȞ$ B+ 6֑БOw}2ztR}wMsw P]~'v;.ggz_rNnۼTbbq d_N`:Q ɞdv[VKve)dXY,n-@WΦg.E]]Kir(J wX{~q&} rq*uz{ݓO N0f2L7Xd@G1(I;W6zWo 8aaVd /7 )Lg1vl!Sa+2: rzͩ5q߾:O,hʿoN U6T"LQk#p?_'z!,hO)23\TCTBC Z28&['B@o VҢxt&F.ฟ#W/srXmFZ*S9M0]ZmŔx*kpe8㜄:~jtMmteS.aMoR&1,SsCyr"k\UBkcb Oƥ,sg5H0H }$}׸fe,?1tNG MuouS99u8*Uh  WǷ /eA ܭ^V{AuIf=@ZTϫ6$Jj٭^Y?{V w(C<.º ȆtH;m \l1L4zM\vټ K0|{K)Dsg0m=x?{Qǯ"'l0 u RU>l@]۝9*laTWxUD1A.f{Gص_7*+A%f;\~u9kw?;nv1t fȅA@-/A Rc ;¡eo8޻5Sz Gw_b@k %Rx4S2Ie1 ,2Q r,K\ @Qߎz(X3&d"L]rAU@Εcʨ CE[:702"-AKJY DV@>ga1׬UY;!;|lUP_\<{sx#f ͇[hk)gG^\}xv55ъ7zW֙dP~K#xzQ} gi hD>ۨEr1Hhz`S '' ocbeS,'S /L'@0 )y Ԟ!Lf1#&,}C+!18.ˑVXDtf`#z-q`y/`u4a+s gK ',T*ZUQdo+K/Viά@A(X#Ţ{P0+JYsrq2@3KqL$C3~Y%{=l N_\7g-bQ^ucJ x gͲc<'g)䊲pG|CXۍ!i-AZ&;H^2 P4 Bd o  %+ l 9sΫ+4rp4$qRǬ+G<Ҽ.9R -Xeq\eznhO`=$ fqXriY\?#N?iT5i:RLʨ&)+-# RMGj\̕PuY1M=Mx搱1sZn$Ŏ456+cFhmԱB)ASt L<֖(2bHkR@Pv_XgQCLa+t_- ś߼ZQ6ˮȑO?FIAXՆ8VX+|2-*>7ٓߋ{ps~rF䓛\]'[%_Lၵ{y`ǜ'WޚuW~:shi!gG%Nۂ/7~[ώ&(uT &OM6 !sM 4%]8;*#Pru{s}Kcw1 #Ύ\"jTٗ'68LDI41E 23M,dL!T+aYx/".ql]x _V F> Či qeSQtI9 &[t&n> B䣋Sơe!@e|j$ltJ%yrTA;fVJ3gy9 i1W1>{"fdͬ/={rba3d,4"\C*hFh;l[@ʐ RI|s:w(2ąQv%MrdP8QZɂ)f3iT  !fM,bIX˽z g˖n+9i#)0$fEB0*eHGC_R$t3@OCBcL4ώyB;4d) 쁨Wf6(HFfB:h&:̙4WO4g חO}2"MKhեiN@59`lˮ^džF?jVǼZM#F%'._x%]EpQz2z*&f3JMfEE ш4Ƚ輈snSer_vNqÝAl7N~$lqt0NeTMjL9K(2;GJ,ltd D3a(Lh5/.RtO)HyJ+4󔻶"lKJotWnXz91ђ+6K")Ͳ N@R$(A n Ctr+RkkP:kaČORr44)'K1BEc+yYgTk@כ>\dKT3{@H[ƤZ Pȸk(Fh /|3=聀9 .ѝ kE\+PQ̥+# "+w-h;⸬!&t o*!hfcJD{Xq!€ C=Y ڝC[e!g({;ՌUqRo=ֆS\ĎqNJpqibb'PA8fvgn|]vvt&G{cf8`¡2, 
mӷGi9 /uC- kzlG[\W\+tX N5-Pܷ _klҷsQ<m7MlOvl盅r?%@yEnjVvMQ[ +Җ{qJD2D2HؚbssmNg 9Rv6Qβ״Y qF;gR2jh\PTx^`6Sx ةo*K~77~b>/XmEh1g*Hڜyh/'$ڐ++.yLM{@4oќ8.x!/yWj!?lz;v;Őr@o[M/;_7}~CAipmZ8D3Zr6fvQ'IY' ]*@ ]>I]qUf%;a|<+@3޹Br0td!UhpnMEw1 әDU8_EnQ˙#ߝ]%#z'9TCjN(XkU hdž3 9E[NnoJbK%i@t[#u JQ ]L3%ӽHJ{索N#GI9D&M1!fhzR.*-J}B7Nk.j -?W UJnzK.G$ƒRsA J!䗎*:`V9ڰY)S5y6WY$ͭs]]zuqiǐI-*VǞQZbuO{r0"Ӷ Kq#'jo?R8c'jɷseȅt} C"QT1 =VF1I #!83)-@j& V D:TZ}Zuv:;❍W;C\uv -՜NkiB-P DPɐS(%4PՑgSEw,YǷ?<@iߓ¯eC~Ŋu1޼y"Eo 憍4f5۟Ү!,5k!ӌ߹(Ap=Lm3&jO19~;jقXC sΟnEw܋"4?8MKe6>s?mC>MH%zmoM\ҿc bk0\Kdyjpmlr5˔;`.jOJi{4pKez{Cr*ZGF]nu1pQ(c%(mjպ5!WuJ I ?ͧc.>N,$P@qn֒ ALVp3Teqy@?c<Ő$hO-X/lNY{`WpR&dg0@D3=MT/⊙TZAnK߯$ג,NDyQ"Q3R].{BþF}^ETsTmy0˴|@g Tmi9 vv1CR>~!{HmNB)8M6&@VKS5\ Aću$0VF,Ǚa,h 7 cp'w# ud#1QL*1ЂF4y"p#I==4!.OϜ 6UoLXd mbko;rr8DIO澰m&O GU]tieoE{o{h3}sETߢ/h3 aI?^~Ygϼ(D ApVTPGvsB244 J1'F af;R\R^BIg9B/U/jk}*Bu l~D#ϸTz$~[WƱue"1x_ '6No\w5w/ ~IU%DܽX2,;ZrucMSW[6;C eq\"qکD+]+@=V J8*iQfKHWKR -XąDc3W\Sڥ$BG]!BMg]Z~֦92 7עHpؙZK(QBy-aLB ^ruK"BߵNT\CRXDT?r=KUz7j% s-XFI٫NA?:= F3TGRa\btA"T|LUwVŷQ| 6n;4Ra @i ̐&W>%y l,FDLiĥh恖e#&gm p`WdrAQEgIϔQBL,DfWvI1Rt%;E "Q$ 1 IAd`u(#N#AAS[ ٖ&bb!Mt/MF$>h#*FCaBHPaPPq"j "G7H߃:ciTQV{j=F̹V828<pCKY% Rǂ3@P(UBf|eQx)7le6b'_szkCޥUZ]R ?8oۇ3#o O?y쾏jgݯ;0}MXbVx4yA_3cXcm%]?3]ca5뭀9Y"V]~tw$ !'~Z_9rbMaX@=5Ce8HBKDjč(dJ$i5n6 KB jD0ZGJ24:PB &P1ܽO;zx~{)Ofnxlq_5ݏGq2sy8}]5?O14 TرnV]=3('y ;Y$?A] *VK=OqÔ FхKvհd_cb {krPK(EWJ2ZoB2։̑)R(@t͛wk@mEPҮ"HGfiiPFސWp όqQ8Nlh J\g , gzH_!MWթՀ7s`I^20jkYD)$O0")W,afש;uj;AxjG_MR!\PF߿*2qG.SϷvbʜ㋟inoҗ9ij}`Q_VphGO5PSMU`}|oɏ>$p/"jr'!ѵ^c/ߪ<{xZӴ%ƾEÇvo0KvʈO)p- r -ljEWTrL (8Z$653Q@YhcGѴH'luW,yDs&̓F|\f#Un.x{ U{[w8rtzTCxDC5'.9yܞ(if)oYzF$q-\4%v8s!C@2| KΜ^yH f^ь ,+ ܞ#&{> /p-9 ɒ֬@[u~> aEZ6KgY{޾'DŽf Y_ yUv%u/ɆU\\鱳is0 u1op̂T :V|vYDBhbI ~+ ΁h@K)/-5Nt0qt"qLK-+֠&*'u I?Bt # 0ShGguq J 1غ+dF9+*C ʫXFM9BI@3jpU,l+'} '\-y`T%8ϥua=>!Zrť}0Y: Yˎµx2ڰ!܍gqc%]lYѣ$yRȫY:}"Ia*.3fa܈3w/uYU&0ZаS4\̯N0or%0D<2ھ[àwu_'gFڊ_y kB xW..Y+o ;, .<0`\PDJ$('q3O#FoAJRbC*k瞁欝ylpq}Bk|q8+jI-S&Et|DaV@@vRltQN)i]ƹHL7mIVzq[jzZҬpTNnG+QZcZaTZ\ՠɍ7{Gs7"sd3?L(y]h?޼ˇK~_4mc 
]zyůp0%N NK?׊SaJDn?-'Ŝ|Ԟ**+T.^[ҐMtZ%r3ܮuB 5 GuQǺ]nՂ*zn;kА[:FSĕ`y'ǂ/sp Px;IJ)+(3meH`L(eB*\ K @ R*qFԂ>]oUb1CN#} 3cLi| Ƕ+<phhm*RRST;kK4!p_^ UVqGTPXy#R,N)Nc(Y5-BP%0n4 ;k4 ZDa]o ;c-Rk#1CA |2 T-g2"vJ'Z(Y@UpMt j Rvq `w3⪍Aa)  7b?CѶ\"i헪1S 6MI?:m-j&4gw9 !z?"5}/58K9,3cc B0mÇSGxt -N'ܮ n&Wd-TثsJPKnҗyM|F|zrTyZgx%~Oq>$گ{WcqP#牿ws TԬ26Gwmũhl4Uܸࠩ6[J|7V߃EO1 Q-䒦UB;.xֺ.qjxՃ~u+#pB?8\waw.X^y&^Nbr썏l 7CϑKkImH@inE|m$$S ;]Xf_W|M )@B} ΗWͿ5[\JW dcz9D[:G8_z P6t¡񺉧܊*[Aפ 6P4ũ 8J =8JZQֱnbYJ&BE6^;Ήn\Y9T+/}yKSvf^Ô'r12,]`%%zri(S_l@KGG0?9/oocwZvY&k{3 ǧoBY /X ,]Zz]cIe޿y ;F\)|RTB+k579n4dMlrQZ@D;P*:kH˘S֍G]!PONPsh08)cH5@1+u&Z4N (ݎLfLꆵb5Rjct9p`ձAt9!% Fhp^`wagՐZyR[H+($%gkl$蒨ȣ+=TF̔ epԌ>%Q3ݮZPcD[ 5h eis=Pv<3Ƈ1Kډ/n<4ՃOw1ӳl׶A\5ae2C5:fR[ wM g=C:$2C;*"@Tgu2@\>>˧WbYYR 5ٜT+7LJ'[wsnwzH<Y}dgz0}k@ 8p~rDʘHuJA^ŝÚal`;CT;k&BcI5[:z>0wu۸w:'ɳ9zp:5MGy/_¶ 1?n m׼hXE# M~^z(8D]jQRג]Sn._G^9d@_KlmS)rKtAf͍"cLJ 63+`nw?jc\@N2H(T:?,N3sf9 qo̚$Lu6:zQ4}E5 ~]Ξ.>җ\8.7W{6kR0O2C3Hj.P^} 0GwbPiä6s TsB,tIR2xSGdVd4;DKFC V t_!PN [ᜣ JL%nR1Wbb LVXS(v?l""HW@)N}ٻ޸#WQߪ/S 6y}ǖ-Fc翧{88&94 2E֭9%}RQ305Pۄ]eQ_9Hh!;':ݕDKX9s״⎉3ʑ}aGj^7ځyw"[1va]%x} p#o-!MCb?p@{De@clŤs1>T61A]uN47HGjRp%Kѯq+E6$^Rw=xIj !>֐zubft.;*6O{㠊bdF8lv89Gn:fpxL:) vFѝ`6Zf|8(&ucbAREqFK6r81Kq5wNJJ0ml$lbT"QzΞ&zf5낋5/%!n<%,Q,6p.tp^EyTSҁБk;Dqg@ I8ukj=JR$RIN$IXh)Ps"z!:PR'[@mI  *Ľ}՟j@~T3|,H02:#44 TdB A"L E5FY[`ʻt{g2x^jPTqfo+* A]ێ<5Ez)gan >|G-ab4 V ł)&YLSgW0_1KLۋPBz%H|.BgwKDL()ݞ8؝|  E I~jQm> /渐0uw7s|նr:O7(7;ʞP?=1N+^=it}0T0FA>i2pΟ|G<ͥ&4\y1w%1{^%2u4{O-^-^,8ZdM-7e2 yG{~uݾ֋64;׺o;S6kIuҟcd7\zoU"§DM*].o[ޜ4x\ \Ũ2\TQ/U5{wxHZŜW͈kCQ)J +Tc~y6ѻ ?Q,#! %d b:ƪ?Y<|uCuBc0Z8 F.= 4"%cbe2m-(|c^lˁruu&oW2W;yMww7m?WgWۿz}|uL /V_+&~\Pſdo I6iÛ؛U3ҍ0I-0ACv$K:@*Rsv|H6A+S+_X~rgi\WdQnOHͅ^75m\@ܽvBX.(9FЮV6O lpmΝ d!/DlJӻM6.Jec:ݎ{%6&ʋ-3MȦ>n(1V*)FvSOʻO2MM vo[dN:Y볫w?;̓g)mԱ(Z߾}s~4O;ͷ[c< wBG9>j>,]Ux}mbSr{H^k_QP ~ɥ,9]a)),QhLP)Yأ:r{/Qdn$F8RM=px?σ 8۟ȤW,0RYFcxE ^VVi"= 9~#یE!#f)`"ɏ  ίRRԁo<u%-vZmɅ 䯺3v|P꧸se|I0$'M'y5?wHG^I5hjM/Tkt6Bؿ;|LM2744M ƃ 2iH ^aiRp0cHՄc|*x~}}v2R|vz^mr Vf/o69wZ⏷CfZ\0T אtZy[ppBS-9E(8+@$X2 tx$ }WԲ^Y{X0=? 
wJb[$G[*Ֆ9C/!0EVX8.+ k^E :@r Vc4f2e $ČGTd>lls1>T(4kQP`]J{f ,OpB|O(F U #-RcC3 4=KĦ)CMR|S-)Xb<Z'V?=V1Z "ӒWM?==c3.#fbbcY^x6C7 rrpؼICQ1)fEB_)1XJN ! SkBR`< d@0\%< \ QPA^wБm=_!jGK+aN2m;T5ХwMN`Y]}xaM?xЦƎֺJIٯq B.!r޼ *xeK?[ ތNe kOyO"ןS)~/걝5ʕIf̫?Wj3+9CAMu6.}/"[QN/'#ݔJw?5E7/V7Eٛ3k%o);ll `'zLdRJ1T3[yo:Q]y2GrQt z螴Ӣr:-_wd5+4-AcΓZV*DMutmJ h$mvQTƱ$hӓQAVi;:8]ߧ-ol-*JٜyNfN8>زlSHf#p"3g v<=Q* Im f?{,^: )*d dPwiRZ)*IlZx888ptGt"1iQ>72hÕ?][o\9r+`ObFN$6W[XJw[lV-\% dd.<Ū 6xʚ΋=Ag L gl"WZq+}g8׻n1gPf-F&ɶ)̈́[`#ݿvHuZ#49=6^OܛbYu_cvG$E1-{w-ꈬՠTHRV;ekM=/7#*rwEn2:YO~*'Zy`lPOUX=0P>}o :"yy 6å;O~ޞX֍h{V|('g_ο^{J2WWoIh[)߲u@*}bh?^ lj gugof1y-{kf&hO/x;vD;Yhnsʥj`PA⠆ꂭم`դ\˯'dC4sd~H'/$vAKZP yn u'twTQ5SrM(Pc KIy>Ҝal_\훿/?nsɯ>Sf cY2n~.VmÁa]D>tg{gb%Ю5Dl&kߑRxWwqz&)uqbnN.rz~.߽uXC~~n?t-7q[k;-=?z|ξۭM#Uу(%^zkjdzSJ+ɑ(˕:1HM1(H>h ~焠]mp6UG cL1W z2}Y!٢.#PKBF] =f7cIlᓙz6Qzs Jr0R{^~?Nz-(>|ۧ c0[S6{dHV/D,w4sޱ?@JݡWfw*\Hgܢ4pkQ*XzVW Oξ ejބA)@(h0=+#!XQ !Nc9G]?ďHu7RA?_810@l&" 8b\]/>9Pdcwp4fecr3'@@Duz@@ ~y Rl.qÐ|9>Zc%75 8t0x+b(-X7rtOז?l=?rYX~!ϐ=tɩQ}.dG~C"PxQ"tpFӭIw#맏.0j W-gW]ڑ#'}%k999g3(Lpp` i6n󍪂pfOcPlbe"ULP|K5!@4nqjn%%V sH*Z.Gn&r bM4!H.OGM6BH{ ֧BTS3$zb#RTgF059~b9Q=Av(XR#%F8)-5v}YhCo&ڨz &cf+z(cڨx:Rvy;5!%Q;I7˷"I"6K(@(S!;ew,lz eIp#dg۔ ![~>8$|\d:4,2>o,0O5jqh3J@ H[Yuz76n[,=#ASK8{> 6 .D g9[9 c6h8aPWeW;yŸzS[;XZ2eBй`kI8C\ YЋeJFHT[o%`;;AI6+*Gu1.u j; ₏"L ,*2dZZ9*J;!C1xjb]DSTepVMG-yQ{j8HVnmŔD!Fb%,gl c0L1` NzM*(GZYUIF9BEmulSm//f_D|'6Ωboe 6+:R1DW? 
natw^;팞&bXlʹ=.If-ދ>{ fEqjNOŅHRmҶJ]bERFwȎ5f Yd*Ԭg&!5 K;U$<1څ%LTJG%PtHc4, Z08j h@wXZZyk|[oqvbcQ,o#3Լų2 |2*EQsX߄yϒygZo\8ksS7#>ҸƑ~c5N7>wXd\(jCP8n 3m$hãF(ceBy}~gLtƂSh~a䑋;';DZپΥ 1pKM/G.@tNGMhչՓQ[6=1M|1 3qy[gz{=ԼQ)A:ho16K^ԋ'7?*WnK.SE)+;hE$ߧRW-x{WǍ/ܢ, >[p퀀cYR$I|nf$r{%X]o,|(+}f?덱c߱0C<^W],ƟZ{BW~e 7IU+|*zE|x9>ټ@& T>)BscSůA#z21hnGps06WݦDG{7+XԘwx#nyH9 m MtælMҏ Pk~P3̃!)YtghO/.noݛ]eFF5(Ăllȶ~}jmO͑6MV @s]pRy+BTy@!2!'l;;=X͈d hBCދQ/x#*FصX0oꉙtb 뀖׌ySir `ͷ7?F<{y,Ntj.5 y!LI.B$F#(nQ|i.(yDARyU@8($}" Z 1\cP:2,"D`W3^G Pt)^~b<_3>ݥ~ۍc*q>v,>Z!Vkߙ:KP9gYsg$ bݲ탬?Ґb]nyEaLLW\:Zj"ުGqS ۵]ݏhn;7glc?:^ڟ3Vge[BQPS5zY ~ח/|uߘye};=;=ջO4LMݲ9h&xQz;3~C6 ]Jd;}w9߉!+54k\5lk=C&}KͬU){;s5+SskJ[G]j48h]]`o.8uUMz {3*N*eϨEF柗o&^9-H?]Pfj:0tK?wW7}m[T~ (W)V/iҬJYyo<"Q/.n5?iض񊚹 2,S`ɱAoŻuPO7=T&`Yկ7_V':q[g1b҇~}jdOJ:Ji}=/>>:19P}[ͭ_ Roov9GK7cHL261!ț:HcXi`)FcgP6?1DNSI,5ucvwF2P9Yx6f #eN*FQ飍ykfRH;ڎ K%tbxakFyŶ[z肒С׶A' Ne#L<0%&f>*n4Y.&ԨEXEQ?|] Y#8;35oR;ϟPBH=ˊ?'J 15hځQٹK2@gZ1R@a. e(]_W˧.JUqΓDxv0ʰqUzP)[l]e,~VC  2.l]y,3@m +D!)a8$&kJ(!}v4=PV16^EQQI 5S֚"zeD.700)ט md4GAܟjhHg Tِ9D se: l(Yr@̹u/e-J" RI%2lw'ʌx0l):$ɕS- ۏj@Y5eR3rzFl͉>GVҫd{뢅v-ʡО%0A8$чWzɰ11@ 7DŰIuyMEd򼩧VS&ige\Q.J&:YRT&J)Z):+^vҘY{UXP_++Vi#"9gI5+*+-'NI{:uvkq%ipfIbJV2/W~Lw˄;z-2l-.8MZσsߛ ğ%1"P0Qr` _ZKݟ@kpu7oK|%&cMՠ9Ufos)+鉴4͵=p?Bvm9=:x0hzc0*XBS)BF)J) 1lw"%+pcԶ D?we޿_ͽ;@Xs Y'<ݘ\K{NOD]i Bٙ #+7Z^yq+UҮ@-r+F%Tn"3~|UQ%F\w!r\v> ut;%%kZ,YAwJUR,^rn}f`ZHٓ5< |bΧ(@LeYOKZe=T:BdG.|Z.řf8s2yMJ`f1LE&*"`V0y&xMט4 0$!a뽴\mB*oUlѰtZdJ[3a1y2'/;cGǜ\j:\?V;夢GT$_(̌R:E(3iOi>lb%&lK#\*H̄ "c-`e$C6:BT&%e:ЪT:!q+5ɣ{MRZ>K:rjTKB)IPV*yPUcJBzN~rގ=*.i=*h c6czU mpmIRbhZn-MH&{%7@cI쓶Z1M+C\ʋ=J©5Ӟ[2}"MeOeeB靬Zd?iܪ\fi⮝ށm6vmП^IETRfŐki%JӄMX'HEtJ:eAMm%cvń6J`(o{Zh HvkX(;بm^L_i"EX=j]\qVO0[fõ5S4A)}(pX'?}8 nծQ(>&G7ɀ5xpdkԀlm O)% Lz.A+9`ѣÜxie,WSԫ$>?|K_N nLEqm*,bWڑ`sl4 z{LIPpL\k3ڥ@iV.L|ݚ1>q_.7_ֱ^D2/>^|w@+\I8gXSYi$R1zb_^n_Unr N̼V #/^vVFLDKH|&`"G{BI;Q, 兠;kb H#)Qwȳ4kHي8[X gTߥrQ૴ m SŸ⬚OquQ^p uQvyM=OlWeGpprSҪA`u*E7on`JY4B--)Pp =骦J $3&X  CpA̵vYG1/^VHV OY$@\ QvkoQ2#x8Cra20QY_/]ct逰iwFKŻ2cҁb 6,K!h]ٝQL ke;#/8uYiS}f_.3ԷmFٰrl/Qn壾i/M~ &T_:%谪ؘ'e B$De!Ptj4ϭ*J 
MSC.(5%@T(E΂kG "z̊=T6<4pM:[oY᳗1`u$ߤyؾncPqœVahJi}`Ӛ3<|zq;+ $FZizNgQC EĠ )I;zeϹM`[>?{(gdQl  -ۄڀZO6dJ]!]!SN4Ґkq#Gnj:c_@cMnN?>5ZpĆQWYZq2j L@f*'1+ FRLCDt(0oHpTVFoNN?u*Trj5hCּd3h>܆D [ltfsȚ=z do8/;!Knc:3;!O6cgd'䈍*^W&okZ8z)+O+ u@Q`\)#Y9 -#r52cNel޺kn 4MM9 ܔ}U(9ver eC/Ն@ Kwf'V(kma:$N̾(!'S^sYسĈFujR=cA "œ[5 H&LrUnMSs3+mN98):i! TN(:9FפD$yud:؆:QpJ5cStnZZ5bepŤ@"O9\:BKE{\ XKCDiIG_>x~:Q2[z:ܥJ?_sO~Htgedfvtz ygƣf8ϜfC&#b0ɒS_[9_F|b러$p\fM TE|_)eNSy č#õinEx0z Zڐ$ڢit[?mkJ-ysKIķT@UWk3OѺiS൩K*.="R{%J h(<`M)!8Զ. G`18 *JPEjAυj`IBtK_toKPO-jPҲ?ޫQP Ez+4%Фע4zG9b%3 >HD9{ِYNͭQd3dSfrvBJL~ /䅤SC;1%"Jk؈LKs=`?(SŸh:,d΀ 2DGӒvr"Q~4-34M9$ǯo7j9xQJWis{7QMF1w7׫h3ECv3xMj&,o>]i)8$&2fnX;` d" G'~XZǵ*$&*WX*B@+[*dů>DRd:xs$Nj[gz*%FM.Y=ƝǾFu&D+~"yH3ȸ7*|@?j}o6 |O}孢lYv–H噖TP:t#A -Id5KF{xXx/sAj[rU Ȅ~yIL {&>'fR{I9>x/d,KXS) sX=,]x҄B4aJ/xoC$>fCqcNe4=`8{f&ˁVea\g Cm9:#CEi!)vˑl-@6kܶ9 L#/'s׉1gnܷH#w]V wEe677#0AIj`|]t#TRdUJSY. W+ P7xj4ɸԺyTt>#ٵүUF ڡZG Ά\ZBP%UQ "xOuX" 4fٓU}mA:!ށ(_ ; 14ceZXJ@.<f\5Jj[M/xu\R.^v=Rn h{KWI [ZY> w?_erK;׏O¢#|cPlYs'9*nBzn>>̆~6(MN¯WVHN[뫺fz~ QS Q04DΫVqGo%9C竣Yl~K&uDzxT NMTaWؠ;pZqS'q3Nj}KYp;}tfjV6M:bŀJr/E\v. &i^]l(6Q\ G*iИiИm|xAyY!y Xa]֖X:(A ,te4/6;,Ue+S3?l!x6WQ^/ ܆Z4~Ch}݇M0ܤ ɩ$[߸)R7~%U!5c3i˪%6ڣ =JtUyolP.J]kE]]MUT`d㌌ x{@%P߹H6Qv[՟EQX"( bl3/ \u-de0?VKT[]~b{ǝZ yNmp3Dځ5V^lMA28u;!F8F׈j9-.¨e>j{aOs{ ;#gش1J=ކgh\6+1gMڴ3RD_g>PJL>>vhb|T!تK)j4tk]Spdb&qw?E" ѻm߾d%БKX0g~!Ww~|  cxHO=2GaW]k8vck2.rEyOC~j5ҳj.1.+"~fsj'Mdž@ݬnMy7;ZOŕ =Jm-9X陋)`iaZV7? GF`45Ox4GNW{v3T9<`w~{^W(n@5wv?H_gɃfW^_nz}Ϗ?]M}|tb@c5EӓxXa"\~i. 
ogX5n/om,p[}>mbVİG N)ppV>r"ikGg{yW}s7{{@Չh+j~;ZȏڶIFU]HxaKޮ/3}IIG{8&[(ou0-M^{Jm~狫̀~ pvAw, //Ido7.J{>Lu1wǑ0#x@+65_Aj#y.S&?=nQ}~Ot;c6#̴GK[]4ȧ\fϢA0~Ot;, B{n[]tǧO1gg+WK:1N:0g7,˛˯r:)._}i/߽xxHѱ`Ljv ۻnu Z1FLbv6@\F/^BDhK7I\k W9%9F<yD^{wM&I.H0ħtWF#.K d*{lhuZx*KD_F\8]t)"Nu>N$gq(=4q2˽Vj}m?d}}XsL Z8FeHjәurv1 `*~һ3d?ȉϛv3(/L{'ѿ ;HԬpHao?y}!}ITZ~hr/?;9}l/+}YAeO{Gαϸ}J 6VzNAkz#,DpxݎP7{|zwҞwIL{;޹xRn;h-W#16P͙j:$oa=;W?}cW/糕}aH\\ڵN~jטJ?OLsvo6ﴝE.mƂxcif}I} wI1l>j^SM Bl[g;W=W՗ldTH4'V?}|+i:[y-1f$?|Rn[h< (oWK1 gD!JWL(T r^6|y}\C4#+W ]<(nGD^k^W|kqDߞxT2F2v̉M')VNwsR1.17/hDmHV9&|29Cf7P$ކ4GUDX}q'i~iH?2=1cCW8Ը# w$>]\h|aaCvEj J!@/o?MH@ٽ]e-UƟVӃmyP(&Dy%JHQV k gPatΐVZ*W Dr,a .8%"4btLJ;.۰I{$<{F(Ëѷ|n~p֗2.4Vnv|.ϙv2֘ܭ'w=wB yqm0JA`dciyӓc)wlC8p-ϼ "ʻU܎\^8D"Tn%٦LpID!+ 4L2$*m/1)j \KALlɒ/]PƍӺQ^; 7لXph?N#8wP9>]0E:תXIaFY5˔2gDB7=n-{cŢ*s syT:NlM&I Qo8#n] %I8T6g~|v{,dD-'9(&ԇis 0GXSÄ8u R?~ӠZ޵0&AlTİO[&kH3-$4Cn$Q" O=0)1r cĘ` OƽR hwخwqe1HD 5]ZS 0  ([O)fLW-`%E!)TF-lD-Iwg_ڒهզ[v_e}uȼq3vzx_6]_< ;D̀R!Tg U<'7cNFymz05L胚&Q8>gqe @`px!P锁m NhtB@|=)AnYrDq;͘2͑a#8>CTH?A?Tib 8VLimGs+Qʤ]D Իz[TX}*ɕL:C-5(R:Wu0Guc"L_`IC'4rp8A2vݛev$<_'Nq]U}3i*ʞ#ʛS8 aa@%_F;)P_E4c)##"`2bs S/4Ab7bY=%$TXMEwl7ɸSm'ϯy^`18tF4Y^2)*X ̳Crcň= DiS:QT0OM}‰=(E+aKyak|z[aѤR.(Cz4GoOsyI}d'!(t"5'q"'9 hRHe3@!#!N"8jC9oDp$hı=d 䍤7;h;5sXwvoֹa.TE.p"7aU$8D˂2*PhΩҬ̑(h sfXTP(+Q^p QUV%VRqA* QU2JP@ hA Xۿ|m CbW9:z2+%eR,t3c !g#cFcR79K1gKR $DCp*5-/3`fT;:*R{\Ay7/pw&kO(A[Yݸ\7ڤl=nՊ;!(I깙ҬW5q8գS!}XW1aߏíO{3#upJFҤ"KNHd$*U@JJw]HNusmnhS8n}L6upLpFq;gsWl?zkh ~,ol|d;J+Jl=ă`X5ckp;ϭLX'ُЮ Ht]7 9yK39"" h󒘮1 YDD0+|:Pzl|sӱ~$+ \1\rK9+yhRk (`"pи$('wJOڤV1bWfόgI(L23.a3+,b7^nwȎX<G@=w(O多z?Ff8=3x<1 ƽ|Nf: JD+d ((5˵"4J qʌZTԸ ݚϴ١>ORYJuL<mryolJ4|j0 _ pf۫a'/PA$9}я2]~~џ\I>|t}^}}tuw;fG3BH%Jg6(LfH 0ss΋JVpTP +&f0eL]RVz5a5{5dAV#+T:aIkljD|GmnxJ+poM0IeG3PcU2123{ͧ_lZX-4-Yi;Dd&l%de={c}>eP._[Ճ?0sD/:A #;Ǘ3 C'|'i&CEȍy`!,P8;a\70MFR3 "/(}ynąJ,7D}ȈP [ >Pb Tsk&]?f]H]PDa9b3:Ci2DE61Zzv'dlcZ%A\:$ '(N $C='&I=<OyHRK#)o=71MF̗I9uBPJXGDMv8,Q =׉] b`H [ˉpaFif7jEe0Z`3TA:GZ< ny$vEa. 
(s @%g[e 19i!SeJ3eۅ!*G2+St*ѡUBin,&dXTNdʌc\%%Zgbцt?ŚU-Jff %DAɅuS2ʱ2gefpfE&Q5 ”q+E )">:i"7Ԉ& ;#˯7o_Η||1W_axo忛}1зŤ na!Xyx?J+`&\hE3f#ZrL}1ap6,& 22Bhyߥ:)1<he)ߵ7ScrH*ǵ) V=n &33E$'}7ysi 4G.ך9rVdTGˉ9}PҐwF\Ɲ$Sn5bgMoh/#?Up}_).r3| S˸ zxOXI-C&}DQ3 6\v4_OwoyeoIuntw`}]Fǃ=E.䭐l( T3Xo?yŲ@s:5*3UUrC UEQ[`ƁZiiܿ8QR*0g9c,9H+F9] 6zDŰǣB\H e.9rRI⸲`2 e ؋΀UT[rCpVґAX*rRa*[AҚj.5Vj,XE}kŀ㕙ٛ8'W-{˖u8Uɗw?]iN %c2~rx%籙[~iw];a{Lg jg(qf˵N0ć] t;>5oI5g$9lbp~!J_OiJOX|{DX VX[4n=1 ;H_b QN,*k n H 3#T\W-[4o)۪,C0E%wT %h gD5pƲM!\jC*'2$\}T 03g &H@AUŨpw4Ehd?]_8j:yL- GDT&,_c$pd[ΠF<;.aÓY~7|!/s;xxO#vY(.IVFrAS䛧]{w\7e⦉ kN%tI9W& '~JZ@"H3#fwr֥N@Sf)B~bRnPbN.sZwXJh"3SsLU*GӅq<)5%Mdקadûex BQ%1G  !,-!ʘuȘ{z2NFکM5È`u%^ _&a"`} =Js_?{.S%*@_xjM+gc,6`y(Yn'w!ξ ~o Bg\^4c-Ûuc)z_^i+p A SVAw(XU4?~b:CG1Ꜷ;Yg?YE LCU4E8bߩ׭C ZTN1֭C{TYt@@քl{=AjHqg]ȥBRo} 1ɰ6{ s/b1(Ij6΅Y(hNDHI,ԈJDuhqtD8uO.n.+e%GȟjDk}} /3 Yc,ˀCQ){| 8JU[1{gu٬uA rEŒXCǸB0gkQ7p{1ҒHX]hpϟn]h;BB~xpaF1p\E,1jAGe]\DwX*O4"KN s C2cF*`*s}жsa}i3B%j$EHFE'°x('@O5!юR }/42D+@\ejj;TK$ }7 0i6@~D%bGjy9`>9ZumuuB`uḴh=wL\r}8YF ^PѲp@#90:]q, Tz1"z \NC_Q?$*a,.Åa+)"Hd '=j[@URrTuH. 5 05p3gJo_gZvLBܧLYؖ) J,LYk%%=1xglZ5chb1rp98y[H Θ.@GW֥r>zשVն^JAȭ1b.FuwLv1*Hx1 y*Satc-!:FuH2b#[䉆Z&4䙫hN10Ѧuwiy:cDYB Sx{-u֭ y*Z_$&LJ# /m5x4ot%Kߏabz{#կOR@Erً)bXqiYR H1j$#n!e y6!Rb Nia;ʾؐ7쨈e(l 3-E/>T y Y9%4>T!OTD$DKsZ a-Oxj-YKOZKà0s~WDBK@YT{su`L~Ұ,˂j/=m-e8LKxx)"z iiI50-BKQ=^Z1z'%B.a.⍬D{PX{9եo';y^]|6:tH4X4Ϧpg YG?.-/qa@f%fp,ڥ_)*|ۏ t^TP ^ tzLu4-" t-[IBQKԙĦX8ez|Px^B?-"I-ĂfW g$Q(ObX*R ҽa]IlE ԹX4`.ҤDƀ5}ʭWޗ7ƫT*oTF WU7תT(g[(r=I4ip6˦7.y6]SlKx2$^OLTO=Ώ3DB{WNԻ+Byasrjxko}fGƿ@vAq?=6dֻ.jN@dfE+^Olѯ׾AD V?| b.'BL?ߗG^]xX7Wf:~`"onnψyUCׯA_&Df6K>n{$\1*: T-2sDV ն>5X ~:kD[ d, $4rh%֒TYkzr*e7{h?dGs¼ݸa+O-PvjX;QV߇0a#⵺!xJv [Wdc!Qq|vTvqŷ:>͟lTf 1ӁJs5f<'8jU09,9lD=nべ8ƶڽn4G0BUGPeyQZ(Yx'ǰ@Ah)ɇwG&_62 \`j0FU :f8Hxj0?C364C Vn!15hȭʆG909@oaeb?Bj6RIa6y,ŕ^].}T7]DOz7AZJdd/tug$RXJS6s-D/ޙ-Hĵ4IHa4-C)-UCSC?ʼn l\X QI*qFE/ۅdS2Amj^Nx^/ҌZ=zaTk܆DfyEJ"/rQbH EJ W)5 hWҪ[}Ηc?ލs8{`_X}2}LٽzWxx[ٚy`0әXۇ)hY;@ܰiʮ]iO I}Yd8É rrYS"=/K-8KƩb^V>/n&,8k; ݽF =mrͅ/]jt好ysfO㿛/&.>LLVo9퍐:\B_˽.? 
KfԪq'XN'XVOGt`i¨ԪezKm=֕kX=сD0ٵQ)yrt|ylczlvJRT$ a˝:?.`M9{kܤ$M "K„2S.2H^d$.kk"}khA.]Ήʹ7PDi`kߔr?|N9K~xg\I;%0\Jۀ6 iH<=*ʆWW}mǗl[e}P%Aw;=6PR!b[tWrTz%v}3Sj5vo% ]'^"f؝'B@rnt"1*K<F^ k,LUu(w ͨ6<>-MJR!e%6gʾ[H0~\9U]`-$ u-»&]B̌{)"5Fܮ:3"ӱD%#tf@D}D'S::V+eȾrӼ(7A&2yV#LJ&謷m)Tl-ck7*;ЦkV_-]PoO4Eo1NcO'6GC)`OUf(:}izLalB=7f$C *j SA߰%*R(eH }@$VuSjb}>A':A1$6xS:$C@I z2P:u a1-+ :+F :`aFݰEZl[$MN6Ӻ"M *p!!c87;#VY񉤜t-}t֧yF b)rDT %+(S Z<ϲg J)+g &&IrJ2f#(RHs((II 5XP?TF5),iVu"'!pW2d&v}li߂R~W V-]{ ίwK~H}fŞGY쩔ZEW?J-uMs`GHa|p~A/ eD|wPEvy*A]F||]FIcNRbW*\kќ4iytɭ:Ǜڹ' O(TOj萇bab9IT^{?!#E -dd뵗R-t}Y"CXdHʃ1,q~Bw((x'm?wJۇ   ֢l08L&.z©eTG?>:ZV֑ݸ.xMFs:'43/SX2No: j |0Va))P*t(Ӆ4y Yd`NOoOa V\>_?GF1ϷR6t=rym#F]I.P]ĶfO˲4Z@ykA=՘:5VA5ގ@".HD f V$(kf(b\] 2ԊoGLFAI2~]Tv`EW ڗvX޵8..V30;>51Ӽ'gGW& Br ΗrT^Sо3MQwU+)j 0B]o &DF);dvX K NȩW|>" LC:ǪMdJ9a-b6ɽBϬUBW%xز Ef>)zW^=btR5LMG_K#2H)f*TL—wg@ڦl1u8&OEp#t*w cg PV]sz#Te$p޵oNxv@{a =߽R߼ZV .^~ZJ[텖.P^g R*z~ZZJ-Ѓ^2]4gW2}SK_J-W -<-ϓ9a?,M]j0_RꧥN$D -=$M]jaƿp-Bu/R /UӡK-lek)\rǠR.-P:hi)z^Zp{V&ңm޼XbNR`)Ir$IQ2א$D lQVZ`nj\/v- w8}ĕrV\h'a1?{х>Iz: 7M ]g4'zg>~ٸ>bizPws"QxWNe?/þWVbtrլ)\ƺjf}Gt 25cQ'F,z;љ=$Exݱ[9<`w}<#v$Ieoqxڳa&!"_K+)}OoDt6MGsZC i[2&Q_vI%P-XEi"YE'U}sE\3o hszӏPCPDGA(I"}cDuy7*tBAN?Ys VNdf F"'HTAIIp\12%,Ky1@ҹb*-Nd:0 `ut*7]QYοxIvâEXǧJFdYZ4wgD|ܽ߮#ԆrI{{gVM3V{b1x=S^tHG#VL1Z2߾Ɋk*74h\B7U%dݺ5Z0LES,=Vbfw^u ;~ ++{Z3|(bp7zH~mջ+q\ʍ Cȝo wĂS$lqlcl96[ɫ~1Mgt=ΧkJBv}\8+Fl$j_5b]zt㕸}rk)i\6=NuaX4Q6ppH;]Ɲ!,ǣދY%vp~ϯs/1@7zQBݱG)8J74 iMN,y|oa@xav2c;l'p'@d;sҪ!6S8Þ%(LaV:FɈйf@wBYV0;vLV]4I hԌvLM"$qMIԑ7TJwCCJL쑚rʷHT &>OIkJ/ŧ Y?u=am^>O}yqug51zB07) o_n>5$1Ie RADLT!Z*˘T2M `dͫa@^>|CȴMu(1 TqH nȳBD)R "m@kJsxg_n s~Jf6gOլ %J gZNn +M\GcB/dwWb:_,4YC;!5T={BNdvH®Fda]q8\ NpIFkyr>+~-P++?e'RɹԦ&@?];+OB.3Y>Β̼sݧKxcX9l;_4}DJ͊6k(+F]58%llRiƬHX)ce d ce)N})C,]#l:{vZ#=ӹQV(RG3,:&޷@f$0LNC:&7 _utr=փ:]0'?셆i[d 9, >}> e%O%B>|r꬝q?Juyu[b6"?:N涫`l}Q}먾uTV^ O9SEN<5L\;,T*3À4fTcZ)\o>~khUл[~Fm,OK>47';rW!ܶfO˲4Z׎kJۅ Ąr:)ctjPѤ(9Pd)HC򔦉QyrP, A_!JvY܇ȅvDzF6J$O[ F6 UR|08QD )EXP5A)HТQ{%^\&XiŌl3gL LsqpGJ[8;O޵W{8u VKX,OFኩGyr;~[@ˀh`#A~^a\xI[oĺL.U{"OJ/}Ŕ7pMH5F 
¼UOGWbRT:UkE3#T{?ź֪͒Z,JvQA̕Vڗ+`ighh%@Ef=. ðkC +Ef:#_ })e@j5zT]6=A"d%f1ӥFv6{{ݺ͏?mÜ.UJڒ+=*"P>WV,Z{GηmC;ǷYٻmWоY{yȇIz9'AxqXBexmײI3d["Yk7n-Sy!9uw;V'ñ%pYkjnćOmu^Br:}]nzp퉖1,CeV|*ZJ2SmZ7YR9Yb候&Y|\0S.\Rz=gԄ; BO IBIi i+2G״PI"B3! lK]UsIXs!;CipoaܛpDJB 2ju3Zޛ'w48Np`FŸMbw=ӻO%D6{s޽Ѝ|PɦGD6:l:Qd%l8 *J4NMds175:}^dWQpEiDw((mJNxf &</Z;^O֊j˄l8zUim(O.! "j'^W&`W!ͫ(`do1^ HY_WӄJxy@z 1.wJRe=rOt.nswnm=/~^m dzKۓ{f,%)Wa % n)ĥ('m'ū*m%w'')Q0IIZ^ALygmatE5LU6BQq{\eƄl0P44A8H® dnLǽny<4m`Xw%Յp:+7O{TFh'_8W_d[x6KAbTFt:MШُA{'fQ67KBw@/xfx5M$%P'[fˋecz>KB\]E2 yIy+/~a.0T kȢF>">=yep+\=ym$CgvVg [_3 );_!Z?6B5Xv˝'T7 IrK򄫭?FšFmPu3Uvg>Fe]vaNmNGfrk]}8B{uU.rrRjtR6jLj@ u^"x&F1d.rv1R#黔OO4ԡ,C;+czFԳkՠ~qclUu$'gÍX]^yDQ?P"ۣyÇB@SBkwQhGqS,^c(0Cy`JԂW=*x`KR%dyqy&&LcޫjډP(3rsrBQfNÀ, cRϹBE6Ƶ#c$i*~-TeRpϥ9= ֲ$k[ߙGDQP LÀFpaO}ÃH TxL ,.$zXh#·_\: >psfiXC;Z]̪+w<şs3p LhqۊDZ^so8sYwq<"e`JAֳΫa*Kҿ:=~>or~U8\H6X;x@v \\i/|vjtWo3IkO̟E׃lA? <t3ܛm51@oqpJb-G{M N+0 xÛP\|+w+f Ta-^;!v&z D̺k<'ip%D5`+LlR⇿T(w:.Т [!GT(q~>1Wc`UoD).j /D& on0roJX%‡]f \ҳ[x8=QMllT q oino_ ~|;܃eh¤^Fo&V8|<@S(x< {gkiĠ/_㿖oz,A'Ǥ{?d 0p8 ~_qV( ]wd~t]⋽LM^q<[ܔp|2y>Oׯi[p߀~O oOu >/&ߙN : ߸PĀwvq 5smĊ|?ݠ%j\jDG|j|m2nmz;rNt@fK)wfO 5aplN%'0y$,ZlLz}»γ?,Ҿs:QfxOh'23}>Ζ3ᑴԴ)N.:EH>4KA;!! =Y6֊c]ݩ@$[WZ#Z[*Qѩ-(&W`^!>9j8v*bŸ.Ց2,8#"dHwkqeR46AKXCgYEUAugY?α䍰S&?y4*"9Hƞ%)x`XR笒LlG7WByD5(It+: ||Rx~냜&fg5]h7Vo>!݀a+By,侲BE0? 
"|lPgp:$[?Ikikk/w޻m90\eǙ{ggvk2/uaޒ!pWygoTvݴTS0C: t䥛"IR"M[眴6U#Z"G'QY790Ή*1Ij"JJ/$$"A(<1WP U/C ||SʠބR-aaXoO7E]mzB6٦RS%Щn&^0Į6lB9 @,MBcAIcD2(M-!8G뢀H%[*Λ~2г}Β =o7}B5QT&fw%Xf0`.Gh?(R ,A]CBv D7!N2cZN=[|0)Z:Cklrr05ySy zn2EsP-lul qEfDtS^#̎29fzHC `BLk(KУ^(`XdձJA ҂S([΅m eVCԣCG 03\E/&uDP@ F0"D`9a=)iLd0OTP`{A?I| @FXj.="pč"'G tdFlwBJQZr`գkD 4lt0I*LC pdh=_PP"B,$% L<\EݶJ ]T ILeJ䧄p˔ gHcT3$/{ܶ Vu_\9rJݞ˝vyU5dR4 @$%1===3FzG)R j-J)Rm))xS@Qu/B Z-Jn+q$伏(&fRHZmqwA8HQʈ($"?4CajSA![Is%O'a7:% ޳-5$_j^ӊ+}lj@ =>$ yK'KUI2 vPZyL0DCOD`$mȉgu A]5Si\=%_kSL4[?puzwzwzwz/e kELYB:BBp1.B9!(PT!%ߪxK㏯U9yuLOcedR&RF!s+ܩH#1 009oilJ&m.jګ ൊiIByN= UzU(lJ@jB]9S!ew$B˵PE1eW0 *"0T+.#g驪NjF\ @xW A|3z`Z B^w:̀}LJdž:7r#Lp`+"# N M?LWx:ǧ^bo~{(4[cq(\غZ0p$@~<]B`ž} R' ژCޘC8H0ЪCa-kxltol>#d;PB0,q5ӯF N6ظҭ1Zps /wkwr妻|0*oI'(A:Ғ =I *w.d%4*QXPl[ez_R\%Ia+n-[`q5#pG~aq*_j2ALh}OT_+`s8rŽD} 8MV(eJe1tȓroPݴfw+Q]4sw۲sVWsޥbo_~+,-ϫiE]|蝩+/f,_8cbhˮOtrvyqcVޕh5{mvE8 !0T<_J]P%UkvD<POrxqRE 3'bPL!J2j'lTdk:Ϩ)״x ?쐮T%\[;p-lr]վ6}fGv:(qg$%q T!* fPn*6`~V+^ek8G" <¦ᦪPҖ+XkKՌ|stL)"{DVǔZlԃ=F63RJBls!j&?eo{ZFi^VaVQNL#%>PM[׽d4hREl8FDD*LV9"6҆1L7zWԁ rڡ ǕJ</lOf· Z&]LZJ[R)<)4'D#] ػZ0XiIFacvᲒv]WKȐ{T|̑|շcDŽӣw\҆SIvײeθ߂4c:qAg'I\93,x74g1aX` ̓_NNWL_U._FAЩ{x2Y~ɾdq9RJr熲~ł5~* r>^;cIgǭ.':(CUN~+nݸGajKujQɺ̽dZrNݺFn h#W::U?MښusguթFv7Ȕ֬[}LZOքMg'][ҿD}kpV = d>;\ |4g7KSsz0 pV/WUR]̅vc ,;uY=382RmݝP|mT{LkKk=GW Va,%i myj#^݅=8*B:a}.l| MCqϳR"C@Tw:/v|0=>ۥk1,9o GfZHq,WꨰؼYKlˤ&~ͭ! 
UH7`MUrku< ^ `VrzЋH.fi}"O^Q:x,qz]~f?\h0"qFY&+Cq56+6.ܪH)!hR.CZҰTr6xh)ߧhʲЧAw_IN5[=&6".ӻ }J2ɝŭzY;A,RX6VJy׾Pwa]$=H*/*RmQ$Mv\{#˰m%6%;/\<\ջi1$Q(Ba L;|"JS#-m\GWxW <3wI{ bBpn'EUKUO~ihL$'_5eTx<*'r =5O+bGTu-r / \[ QL&qLHRL4J"N5JbahM)!G-PbaߴUU^ey|z \gkt)QOΧ&杙Q ۅ{i]D, sf;&U] M(%6RkBK~G+aCY}&Ny/K[Zh.,%X' 3S7ݓћ΋~Y~I惛nt ̂J^$b]:w=\.NW&ZYZ.0cQV#tMEc &HOmX0aQơep<I S]xdtkjFe,gqG̋3M XYuzxP`zywo \?şD|٫Do`OGa'ͽ[vomN_}k8sr0G%؉d_mH%ȇEZ6pfp8ѯIe;"%J % HH짪~eAsywԧɂvv_P?xt~h\6nzw?, w$_ __XIjO&[~`Ztۑ~y~JTybͲq|Yۏ6)| oiy95_5(mxZ693}]EY^M|$WroV᚟rWidb_f?e/} [tEǦV1Oimt~׫ষlj<4borj%/%'w?_(E6F6yoyCc-m.6Ǿ$ˏ7v|iaDž~ތ&|w7'63OƏ I dr[:tKwֈv[ftN36[^{Nve@0lb﾿{1Λ慭20LM{3W) j\UwG֦R挂T;s,뭠 s../(mwSLe;ΰ&+ŬZC [i{Q BbW bUPmloQ:juٹ[:juvhݶ{gNS@(); UIsu^O ϥKtR̮ܨ-LT#cd!4r@eZ v®{ƾ9/aJ+՘ z]5itvvHj7ٶi+r13g)x(]6vDعNgy;UR Z˞ l\۞}7En(n>yS(߹!HE +%6ܔG˛U_~k\hP)8Cfm>1|)MUfu47uu$izrv u)Tt4UÖzI]eW^,PIxdh;0jV\u\H!At2#&O<(Fd2{-P<7-jZ|ѧZݣJ9J3j=׵6`/Q㫗1ci{|*B'tgZD8\}5|gFϪ^zET$M4zAZX|*19FRt,Ic;z~UV[zUre }%G)8;R>@l"$J-+(|晅ȣJsFJ4pPaigIlHYRW40p*״p:bۊ=3N,]|#-9Nj.m)<ߪpV Z՗cUz3̫\IeNБw[~^n{ wШH ;;;ӂZ Oi;kNգ0Jsi!.+3#YB~--|7Vt÷-_ꬔ+ԷҞ>w{ `D­/wK*J ,cT@\9+cMM.c~!# 72Mf3;tE `.YYn#Z ,ޅX%^4R/L@c|>qtZlȺԷ$|a̘4@ )M͔QNIGj# !*腒J#ocQ>"Yf21X%6JNxXBHn+Z؝e8]}"7fhĄd1,{;_& wIKzGO%#CN ffd: |I8ӌ4+GBD2pjp[ Fh牀@nJ9@^qjzr דö); D \ӆFAY>4ŁLJ8:[cBLTWpFڽ }0YD ұP8 >A{#2LG2U'zhltVn9(e<_J:CS3݋;A9q;_^E˿JdJO N ׳Eʪhհ^='@..{IlTi,Q˨r gPiİ2:j쮦k> UD!%`è#42(mxwhdp\WM}[< ) c)GrFfd>Z#!N:TLs`\̗Ok%u!>1 q  &ӜVd9<SItJBZ+ӧuGJ<KB2:P.~ԹAꇰkP?+i3i@mB̶g3!@A97AوʽXQv""LXYf9##H"Q89JYXd^ }!>t$e)^SVfˑ`@ʖxKfVDF{hS?,혂a$Νgnîb ϢB^ńihxH<тOH)-UX-ge* ڹD9?0Lc2i&gT ND{:z$nۺa1_5B>*Z?4Tc@cx zc/PIs<DP`=1z/:+ lb R>S7d%:s{tUdB$,b8'{2VCBpjTd_ip:9[^·J'OXJȷ2o@[:LY`|q|'Bѵ3_Z@1FFZ-c>heaT_\ $pNc`̞+ ԲHyZVb]OĸT_\J }Hp1u!5=K}VɔŘ{ xg|&׺$cpY[*(?++(uVn~|b&9֓&R&̪qJOpi)|$r#8t/b̤E 2j$5*>42ŵ4wp_`$5PC}"̘W闩Vo\8 V=AC ¤hMVBz9ajr%Z2mZ_\qk-ʺ?`,@(6VD1tZ -ŜF.Lgh.P r7T[̾[ҥ徲 jV;8 ir f9qʹaB[+bUꂼͯgM10f",Qb]d0;FGi/z-QGu)nZK@3XEz"F|: hGkF'ݞc}{>& xCO-ǩ;'eZU&ES"`U$ɤPxax/N˨zHzt穐PiXfb.ʃDǶ a"]l=cXKQfe&)`vV)ѦD]Nle!(v?:hu@kC|fP=X7bx 
٠$fՇl{llznP~1A5xl(YHm9A(ј]qI\dZ_6Zd(&yW%0 Us`KLܒ%Wb4xrH]НA.ٴ;!cD>'QZZ.3I">9(C1vgV:a`9cRјք)1[0OҚk+oKXn8亜I[QpWlDk%rJV;mI]̵z_1X0 粒sזes _dpl kq8|z%(z|v=էQ+d> =FR7g7{ozVx'l0) #*%c.*/UѦX~GoCVO,)UE'3 ŵ_Y#=pI MC[ay+_'A]4C'c,*tڷd quyAF^rQ4sAFa Z))>#i[7ZYi4NPZv1dX ;hBݪb0yb)<38WW6 𜮊Z2^nV.$lX>8h!$()4PIUɓӣmOv"Sq+7)m©Wˡڶ+C1(=?g#Y+w'/VXqE׋9'cE+sN#HNMl#ݚOp=~ 9˕e!j8^&Ǿ/ >yZV40[k3q\_.n+ԮRhm J6ڤRB~*u=Fmߥ|/V7#n֦;3wo5&‘Zv"I5ȴOAϕ{뷡s#yIJDf]5Z7JCfg%Bku?Gt9q+p㢓*HӒz(䨯uȜɣ hwη@ ?ykSUO:N&tZh8(#s"4 o"} pIL"],GG,-^ۥd '|aq '?| 2v>.GBk Cc+$v_~ Pr,SxbU,4}ҝSւ k q>~_VOV_szէQM+q7uO3V 9tէFұਥSKirku WwN/3/+)F-Uq6q']pD^8stT'O?Q7k,?QwYC-}yWf S@eD &kq9Y<1sZ|ZӇT#ak~Z5!VіjOK+\N/erhkni|^;s[akpJ`rb=* kaZ)u6u!%FBj# N|&Ѧt%ڷگo r8!'\2yij>N8)X?N7B崗Ldۯ 9實^p^ m%(IZxQnmZ[bZ }9@"]}wya:^|\A{1.wF65 ˲u>D2\ ۤJGu{+ݧtnKw.oSRj,ȗrWdK.(޽Z# w*"'M(^]McJ(BIApg[+Q% AJNv^EC_/5O~rS`V[6^w l=[he9Oɚ9+ JC#wk`r%%x1A]{%[ _l zmm7azwɖeQ1(ak ^֠tu_؁ %$^9/ˀW+~`AT}\y`Z V`h/.6Fڻik2}S}jZzZگ,7[ϩZ%(іOKI^K o_?ӿ{&w;KJR2eaOTCpE8$I@)W*ifk$ܠJU!v;f7 oƵLbChɊt+u.jпBEHדwnLGoq/.=,5/o zgd4le}'O>7Ƭ|Nԯo֩>|wW~c[5_}p5^974ެu+t_h3:L~ͥnWg+5u!_#\2||l,޽992bcR+VQ+:<[㒓֞QĿ{_iF4 U qo_Ϭ$z&}&-T%℆ma_4w_bWY,4^Q>50-7MO8V@ޟ^1[Vo{ܶAg:i/ɗK`,"^rs-H=(YAdm-r],&p \U6r8M`,7I~2؊ 1y1On&tl'Z-go3I`gQ }7/{J?Ĭ4|Eyye6lwl7>-lW1GGYWˤZۓd:^\UF0ZXs#3\J K#Αq+#feFbwYŏB!ȃ:boM!iwo<~qqo Cz| SABeBtfO~ _SN9GҁK]2X 18 /k=Nd2DQdj)9&,'T}9OEnm(R 3r㽝 /WחTTJ9*8-@Sˣ452"*2֑۳/7BWyBYѱJ݉h;'篹KP{Hk-.UߥڼA,L!A=bjX~@ɪ"dBe>l앧 0ݴE'j`nN&Ihrxzde|CLEI׌IOI']w>8Z/+ӄPM gd̤qlKQn.T&bn5VpJ> FQ+0ZLTfi 23e5g)A41 `i, Iqb.0| F #uG-)Bc33?{{ 4`YCҤnM0ԠDlyV!k76Yek\o1}2>.fjj33W;{^_/>"l=gBElösc,#)ւI,l3P#%̀uʵ%RnI[y¨0}^箍F(F4b1ȳQH#852 @T1LP46 !3"I`3&O +I]S$OGɗkכ9vo諸l^-Fy#;XV_^-$0Z1yqJ Fx^N|fˏ~޸eVC"B8o~@o`TfyyٻawR) շ{7?2#!F&/4O k98;(M`0cl| ID |42EKxwM)S0fl 0z!l\r}n4l%h`#ӔG?}UzsC֐ű3Θ2f6sϩ,-r u:w|֯fZUOb/?b a,a䭅eiM_3V^O"'yk.T6 `{# GLqřBq2&!hU4Wh3 7 TZ@)z]տ3sw|9 !)R$1IdsV/Q5'&.#um9#,H?{埗BJ.'kFŧ/-6Aqq%ykR0o `mZKNک 1[Cݞڐ(B,i}IA=Ƶ-ٓ&aGk~V.bFygO>C> _HБZ&؃ݒ7G-%yEd%o>+Pr\zIr^m\lKlȶe]MŸVnh$QE)ʻCM7EcḌ!i4t"lmn!jhݰͤd悐E~]?ñ^ޖfTlo,PjrMޓE޿>.}0G( ^;D:U5قn{,⦰ 
14507ms (11:20:44.227)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[502768675]: [14.507100201s] [14.507100201s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.228391 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.229230 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230428 4775 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230502 4775 trace.go:236] Trace[222843817]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:31.716) (total time: 12514ms):
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[222843817]: ---"Objects listed" error: 12514ms (11:20:44.230)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[222843817]: [12.514110535s] [12.514110535s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230529 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230902 4775 trace.go:236] Trace[1111838512]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:31.712) (total time: 12518ms):
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[1111838512]: ---"Objects listed" error: 12518ms (11:20:44.230)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[1111838512]: [12.518236096s] [12.518236096s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230924 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.231700 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.257717 4775 csr.go:261] certificate signing request csr-9nmvr is approved, waiting to be issued
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.265398 4775 csr.go:257] certificate signing request csr-9nmvr is issued
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.663055 4775 apiserver.go:52] "Watching apiserver"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667138 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667513 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vxn5f","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667867 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668010 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668067 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668154 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668152 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668226 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668393 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vxn5f"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668850 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668932 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.672077 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.676180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.676187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677009 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677192 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677561 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677580 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677773 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.678242 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.679238 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.679343 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.684069 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.686752 4775 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.692170 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:03:34.710087269 +0000 UTC
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.699934 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.729366 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734708 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734764 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734799 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734821 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734835 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734849 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734864 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734923 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734936 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734952 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734981 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735011 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735143 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.735162 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735191 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735207 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735255 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735271 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735314 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735344 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.735358 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735377 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735511 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735541 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735556 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 
11:20:44.735570 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735637 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735653 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735711 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735814 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735837 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735926 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735947 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735967 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736010 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736112 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736153 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736176 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736196 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736216 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736235 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736256 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736275 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736295 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736435 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736500 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.736528 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736556 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736662 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736705 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736766 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736942 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736966 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737138 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.737158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737179 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737312 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737333 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737421 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737520 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737546 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737568 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737805 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737832 
4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737952 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737976 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 
crc kubenswrapper[4775]: I0127 11:20:44.738119 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738146 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738249 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738274 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738322 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738344 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738390 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.738413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738439 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738679 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 11:20:44 
crc kubenswrapper[4775]: I0127 11:20:44.738889 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738911 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738933 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738955 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738980 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739081 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739132 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.739155 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739202 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739225 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" 
(UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739300 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739348 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739399 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739421 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739541 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739662 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739708 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739733 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758545 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737874 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738069 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738237 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738885 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.739883 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.239855338 +0000 UTC m=+24.381453205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.760298 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.762615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.762682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740223 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740304 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740371 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740876 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741039 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741072 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741232 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.743130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.743980 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.744289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.744583 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749320 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749580 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750152 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.751007 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752059 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.754812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755017 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755614 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756369 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756547 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756905 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756925 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757117 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757138 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757316 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757601 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757875 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758088 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758158 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758174 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758266 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758474 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758687 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.775133 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776541 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776779 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776856 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777009 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777257 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777288 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777267 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777275 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777371 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777620 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777717 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758748 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777880 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777932 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778043 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778194 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778317 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778525 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778985 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779914 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780042 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780186 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780398 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780442 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780513 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780598 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780680 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780705 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " 
pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780836 4775 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780853 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780868 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780882 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 
11:20:44.780894 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780512 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.780675 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.780997 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.28097637 +0000 UTC m=+24.422574217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780907 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781071 4775 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.781871 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.781933 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.281922086 +0000 UTC m=+24.423519873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781977 4775 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781996 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782010 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782026 4775 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782040 4775 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782054 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782067 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782080 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782094 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782108 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782120 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782135 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782148 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782207 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782227 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782244 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782258 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782271 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782286 4775 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782301 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782316 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782329 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782343 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782357 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782371 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782386 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782399 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782412 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782439 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782474 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782487 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782501 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782514 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782527 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782556 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782569 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782587 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782601 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782615 4775 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782628 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782641 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782653 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782664 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782676 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782688 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782700 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782714 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782729 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782742 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782756 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782769 4775 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782780 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782792 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782804 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782816 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782827 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782923 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782943 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782956 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782982 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782997 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783009 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783021 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783033 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783045 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783056 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783067 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783080 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783092 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783135 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783148 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783160 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783172 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783182 4775 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783193 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783202 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783212 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783222 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783232 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783242 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783360 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783370 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783380 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783391 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783402 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783412 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783427 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783436 4775 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783461 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783470 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783479 4775 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783488 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783496 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.784088 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.784191 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.785090 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.785463 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786300 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786646 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787123 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787139 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.788211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.788770 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789508 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789805 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789810 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790859 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790877 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792000 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792339 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792712 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792832 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793192 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793414 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793520 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793547 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793562 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793616 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.293598399 +0000 UTC m=+24.435196266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793736 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794262 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794793 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795655 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795699 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795980 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796067 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796211 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796234 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796247 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796295 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.2962764 +0000 UTC m=+24.437874177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798224 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798963 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800046 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800147 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800495 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800243 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800855 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800880 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800897 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801972 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802008 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802372 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802804 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.803436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.805479 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.806036 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.806244 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.807535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.808390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.809969 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.817812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.820769 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.824073 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.830989 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.841138 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.848331 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885008 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 
11:20:44.885034 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885069 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885083 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885094 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885105 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885115 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.885148 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885163 4775 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885173 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885186 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885196 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885229 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885243 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885254 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885267 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885277 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885311 4775 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885324 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885335 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885355 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885370 
4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885379 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885388 4775 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885397 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885407 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885415 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885423 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885441 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885463 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885471 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885479 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885487 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885495 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885502 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885510 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on 
node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885519 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885526 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885534 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885542 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885549 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885558 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885566 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.885575 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885582 4775 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885590 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885604 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885613 4775 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885622 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885630 4775 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885639 4775 reconciler_common.go:293] "Volume detached for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885648 4775 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885656 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885665 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885675 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885683 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885691 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885698 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885707 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885714 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885722 4775 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885731 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885740 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885753 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885761 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 
27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885769 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885778 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885786 4775 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885794 4775 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885802 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885810 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885819 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885827 4775 reconciler_common.go:293] "Volume detached for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885835 4775 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885844 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885853 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885860 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885868 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885876 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885884 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885892 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885899 4775 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885907 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885915 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885923 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885930 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885938 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885946 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885954 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885963 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885974 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885983 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885991 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885999 4775 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: 
I0127 11:20:44.902779 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.979865 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.988103 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: W0127 11:20:44.997963 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969 WatchSource:0}: Error finding container 024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969: Status 404 returned error can't find the container with id 024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969 Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.000609 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.008754 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:45 crc kubenswrapper[4775]: W0127 11:20:45.015020 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c38486b_7aef_4d58_8637_207994a976d9.slice/crio-74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e WatchSource:0}: Error finding container 74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e: Status 404 returned error can't find the container with id 74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.267279 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 11:15:44 +0000 UTC, rotation deadline is 2026-11-19 17:32:48.613122666 +0000 UTC Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.267640 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7110h12m3.345485877s for next certificate rotation Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.289907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.290018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290053 4775 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.29002917 +0000 UTC m=+25.431626947 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290085 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.290114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290144 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.290126522 +0000 UTC m=+25.431724379 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290207 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290246 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.290239845 +0000 UTC m=+25.431837622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.390529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.390580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390676 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390689 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390699 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390709 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390733 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390745 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc 
kubenswrapper[4775]: E0127 11:20:45.390745 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.39073179 +0000 UTC m=+25.532329567 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390792 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.390780041 +0000 UTC m=+25.532377828 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.693314 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:29:02.781739905 +0000 UTC Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.753952 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.754680 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.755732 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.756304 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.757260 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.757750 4775 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.758328 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.759243 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.759831 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.760715 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.761305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.762305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.762777 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.763257 4775 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.764105 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.764618 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.765567 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.766008 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.766557 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.767568 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.767991 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.768899 4775 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.769330 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.770284 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.770767 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.771405 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.772663 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.773171 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.774339 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.774986 4775 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.775932 4775 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.776040 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.778019 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.779248 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.779737 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.781550 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.782218 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 
11:20:45.783080 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.783779 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.784985 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.785539 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.786906 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.787652 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.788850 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.789443 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 
11:20:45.790561 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.791225 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.792820 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.793475 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.794321 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.794884 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.795845 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.796660 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 
11:20:45.797211 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.847891 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gm7w4"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848166 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qn99x"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848352 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848410 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848619 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dcnmf"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.849669 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852321 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852577 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852764 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852795 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852855 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852917 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.853990 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854065 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854078 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854089 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854364 4775 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.855064 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.879812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.881040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9f343f88d507d23485006b27c70c1d80eaeefbc7c76be208875b5f630ef916a1"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882686 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.883837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.883877 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d8bd653fa4bf5063bd06102c9ce039294f6da0d28d294372f3deebdaf672147a"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.885834 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.886226 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.887979 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" exitCode=255 Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.888051 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.888129 4775 scope.go:117] "RemoveContainer" containerID="67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.889937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxn5f" event={"ID":"0c38486b-7aef-4d58-8637-207994a976d9","Type":"ContainerStarted","Data":"7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.889984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxn5f" event={"ID":"0c38486b-7aef-4d58-8637-207994a976d9","Type":"ContainerStarted","Data":"74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893908 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894049 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894078 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6q4\" (UniqueName: \"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894196 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894690 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895012 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895129 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " 
pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.898386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.910757 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.920970 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.934051 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.949370 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.963294 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995161 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995707 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995816 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995835 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995857 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6q4\" (UniqueName: \"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995900 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995917 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod 
\"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995924 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995977 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 
11:20:45.995999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996074 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996096 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996106 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996120 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996150 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996181 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996230 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996238 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: 
\"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996291 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996321 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996344 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997294 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc 
kubenswrapper[4775]: I0127 11:20:45.997470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997496 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997500 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc 
kubenswrapper[4775]: I0127 11:20:45.996644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.999129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.006294 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.020056 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.020410 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.020710 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021145 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021877 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6q4\" (UniqueName: \"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.039637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.061998 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.092649 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.113686 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.135869 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.147718 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.158774 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.161385 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.170023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.172414 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.176582 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:46 crc kubenswrapper[4775]: W0127 11:20:46.181635 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7707cf23_0a23_4f57_8184_f7a4f7587aa2.slice/crio-412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455 WatchSource:0}: Error finding container 412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455: Status 404 returned error can't find the container with id 412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.186292 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: W0127 11:20:46.187592 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404c5bcc_dd1d_479b_8ce2_2b9fd6f2db9d.slice/crio-3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7 WatchSource:0}: Error finding container 3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7: Status 404 
returned error can't find the container with id 3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.201948 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.214036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.214820 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.217967 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218082 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.217972 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218157 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218107 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218255 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.221811 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.232215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.245158 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.264900 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.275828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.292192 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.298245 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.298217352 +0000 UTC m=+27.439815119 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298718 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298916 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299000 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299041 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" 
(UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299067 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299068 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299147 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.299138626 +0000 UTC m=+27.440736403 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 
11:20:46.299256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299343 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299385 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299424 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299547 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.299535497 +0000 UTC m=+27.441133364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.309643 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.329601 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.347047 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.362201 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.373420 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.387345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400466 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400511 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400563 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod 
\"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400626 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400784 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc 
kubenswrapper[4775]: E0127 11:20:46.400838 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400872 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400886 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400935 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.400917825 +0000 UTC m=+27.542515802 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401310 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401350 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401402 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400811 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401871 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401894 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401922 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402155 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402178 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402191 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402239 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.40222316 +0000 UTC m=+27.543821127 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.405840 4775 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 
11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.410148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.414107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.418723 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.423908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.427523 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.427796 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.439628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.453725 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.467318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.482650 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.496306 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.513620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.532486 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.533743 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.557172 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.578754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.595514 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.610746 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.628814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, 
/tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.646063 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.660976 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.673201 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.685541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.694523 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:17:34.211602619 +0000 UTC Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.698221 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.710136 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.721434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.735338 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744349 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744412 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744663 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744866 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.749613 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.765685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.791175 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.817187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.894120 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.894185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.894201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" 
event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.895272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.895296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"88c6cc63acfc378ddb4f98b32e64b6cc2284716135203b6082128b8b97604592"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897178 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c" exitCode=0 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.899612 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.902528 4775 scope.go:117] 
"RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.902791 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903434 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5" exitCode=0 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903481 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903527 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.910745 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.928847 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.945057 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.954903 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.978888 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.993899 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.006987 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.021337 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.039296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.053748 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.071622 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.083106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.096016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945b
cf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.126207 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.129676 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.183906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.220004 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.259537 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.299633 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.341025 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://559
28f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.379144 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.418536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.459113 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945b
cf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.497137 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.539353 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.578385 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.616682 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.694858 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:37:20.085252434 +0000 UTC Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.909977 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911063 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" exitCode=1 Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911225 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.912967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.914025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.925264 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.936250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.949652 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.965472 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.976836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.987749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.998715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.008786 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74db
b5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.022617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.036228 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.059936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.099185 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.141583 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.182804 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.219758 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.259436 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.309046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320162 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320218 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.320187536 +0000 UTC m=+31.461785333 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320283 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320354 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.32033413 +0000 UTC m=+31.461931967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320428 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320503 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.320492084 +0000 UTC m=+31.462089941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.345667 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.378816 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.421013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.421064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421205 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421223 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421235 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421235 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421257 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421268 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 
11:20:48.421289 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.421271786 +0000 UTC m=+31.562869563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421319 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.421299537 +0000 UTC m=+31.562897324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.423107 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.430110 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9dz9r"] Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.430467 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.450145 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.470386 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.490936 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.510749 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc 
kubenswrapper[4775]: I0127 11:20:48.538079 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.578931 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 
11:20:48.617814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.621948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.621982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.622051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.622073 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.623112 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.669033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.679873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.696019 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:51:59.257235401 +0000 UTC Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.718966 4775 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744427 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744464 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744531 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744548 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744627 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744678 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.757195 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.798462 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.837859 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.866315 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.892105 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 
11:20:48.919036 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 
11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.920099 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936" exitCode=0 Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.920169 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" 
event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936"} Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.921358 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dz9r" event={"ID":"c1ce49b6-6832-4f61-bad3-63174f36eba9","Type":"ContainerStarted","Data":"f56e3b2430b7fad2b35329c5a732c98eed1bee3e7a01738e269bbfc1ebe4672d"} Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.958595 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"clust
er-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.999093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.037327 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.079785 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.118898 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.158308 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.199739 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.245073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.276605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.320306 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.358966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.397983 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.440685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.479904 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.522054 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.557072 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.599361 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.639016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.676453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.697033 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:24:10.303393206 +0000 UTC Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.721498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.759146 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.798360 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.838641 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.877234 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.926691 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0" exitCode=0 Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.926767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0"} Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.928396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dz9r" event={"ID":"c1ce49b6-6832-4f61-bad3-63174f36eba9","Type":"ContainerStarted","Data":"280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d"} Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.939316 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.961102 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.999426 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.040838 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.081042 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 
11:20:50.127038 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.159813 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.205842 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.241576 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.283261 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.323426 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.362314 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.403263 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.443496 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.485474 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.519736 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.563357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.597251 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.642964 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.680542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.697947 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 21:58:00.141605215 +0000 UTC Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.725902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.744981 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.745079 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745125 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.745089 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745390 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745555 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.761259 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 
11:20:50.804179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.840453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.880101 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.922902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.934185 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c" exitCode=0 Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.934268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c"} Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.937576 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.939131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072"} Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.959164 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.000075 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.039917 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.078185 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.119643 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.156955 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.199966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.229461 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.234541 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.239558 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.291431 4775 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.291759 4775 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292830 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.304049 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308409 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.320793 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.320974 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc 
kubenswrapper[4775]: I0127 11:20:51.323938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323970 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.337525 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340993 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.353988 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.361320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.369153 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.369405 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc 
kubenswrapper[4775]: I0127 11:20:51.371255 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.405066 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.437874 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.467364 4775 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: 
I0127 11:20:51.481685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481743 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.501606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.521304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.559993 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584788 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.598771 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687877 4775 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687932 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.698743 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:26:23.976860249 +0000 UTC Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.754275 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.764161 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.776521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.785300 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790187 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.796729 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.842119 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.885833 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc 
kubenswrapper[4775]: I0127 11:20:51.892716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892749 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.922799 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.943320 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e" exitCode=0
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.943354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e"}
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.967412 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.994779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995374 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.997695 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.039672 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.077214 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098162 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.118507 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.159538 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.213984 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217863 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.243982 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.279288 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.318908 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.356951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.357109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.357186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357267 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357345 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357323132 +0000 UTC m=+39.498920919 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357273 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357427 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357359013 +0000 UTC m=+39.498956890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357473 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357443135 +0000 UTC m=+39.499040992 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.359332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.400046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.444283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.457583 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.457640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457772 4775 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457961 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457970 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457981 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457985 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457996 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.458049 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-27 11:21:00.458023282 +0000 UTC m=+39.599621059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.458065 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.458058913 +0000 UTC m=+39.599656690 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.476843 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.519793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525560 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.559497 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.605890 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.627919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:52 crc 
kubenswrapper[4775]: I0127 11:20:52.627966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.627978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.628001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.628014 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.638875 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.687887 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.699344 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:14:53.691746542 +0000 UTC Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.719015 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730541 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.743801 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.743881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.744102 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744120 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833721 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833758 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936883 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.950589 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951362 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951564 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951596 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951711 4775 scope.go:117] "RemoveContainer" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.956529 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda" exitCode=0 Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.956574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.968071 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.983823 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.985815 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.986912 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.996394 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11
\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.010030 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.024849 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039790 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.041014 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.058698 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.079655 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ 
sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\
\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"
mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.091196 4775 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.118628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 
11:20:53.142148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.162983 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.197569 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.237356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 
11:20:53.244577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244634 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.282639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.319877 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346814 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.359113 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.398110 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.403386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.404605 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:53 crc kubenswrapper[4775]: E0127 11:20:53.404880 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.438572 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449591 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449619 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.481007 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.519091 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc 
kubenswrapper[4775]: I0127 11:20:53.553325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553789 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.567759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c18327
24386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.604986 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 
1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.645712 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc 
kubenswrapper[4775]: I0127 11:20:53.657402 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.681168 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.700039 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:51:39.888059413 +0000 UTC Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.722881 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.765342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.800106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.838093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc 
kubenswrapper[4775]: I0127 11:20:53.861725 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.964416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965509 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.971367 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.972468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.972556 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.984872 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.002740 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.015481 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.026936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.040796 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068218 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.078302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.120035 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.157046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171417 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.198941 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.240355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274412 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.283647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.322285 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.366253 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ 
sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\
\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"
mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.376837 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377104 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.397267 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.439255 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.475722 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945b
cf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc 
kubenswrapper[4775]: I0127 11:20:54.479237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479247 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.524260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.562618 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582357 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.602899 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.642861 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685553 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685587 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.700588 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:07:34.85492499 +0000 UTC Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.720437 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744898 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745148 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744903 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.758342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793271 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.808603 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.861505 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPa
th\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.880133 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896827 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.958188 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.974109 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.975516 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998803 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101808 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204691 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306861 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409846 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.701388 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:32:35.229483247 +0000 UTC Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717894 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717987 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923517 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.982903 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.985871 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.987209 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" exitCode=1 Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.987275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.988386 4775 scope.go:117] "RemoveContainer" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.010655 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://559
28f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.030969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.032419 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.032620 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.052031 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.067928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945b
cf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.090039 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.113111 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.129620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.136099 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.146068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.162885 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.182566 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.203525 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.222250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a5
66cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.247090 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] 
Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.261053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342584 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649921 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.702219 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:05:49.110974385 +0000 UTC Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744802 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744890 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.744954 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.745085 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744836 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.745223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753289 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958453 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958497 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.992616 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.994863 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.995372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.995495 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.014725 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.027942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.042948 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.055759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 
11:20:57.060627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060663 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.068563 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.081791 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.092143 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.104320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.115104 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.127355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.138854 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.151087 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a5
66cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163134 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.168104 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/o
vn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.177864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266821 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369796 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472244 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472258 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574911 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677524 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.703301 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 04:39:47.626925617 +0000 UTC Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780814 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883577 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.952661 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5"] Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.953239 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.955210 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.958133 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.978572 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985888 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.993586 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.000570 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.001962 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.004693 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007768 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" exitCode=1 Jan 
27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007848 4775 scope.go:117] "RemoveContainer" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.008535 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.008670 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013601 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013782 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.017801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.040416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.052812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.068629 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a5
66cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.086919 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/o
vn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088499 4775 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088518 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.098495 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.111634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod 
\"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.115410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.115775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.120064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.125599 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.134898 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.138443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.150521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.162877 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.173397 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.185026 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.190996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191122 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.205427 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.234017 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.246122 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.262688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.277052 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.277104 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293139 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: W0127 11:20:58.305481 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722c4ef1_b8ec_4732_908b_4c697d7eef60.slice/crio-b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af WatchSource:0}: Error finding container b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af: Status 404 returned error can't find the container with id b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.306592 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.323866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.336059 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.349705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.399296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405443 4775 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.419008 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.428893 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.442866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.453086 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611318 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.704390 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:43:11.587026499 +0000 UTC Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713679 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.744953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745274 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.744971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745395 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.745664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816203 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.919004 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015133 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015209 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.018420 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc 
kubenswrapper[4775]: I0127 11:20:59.021760 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.027925 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.034208 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.054268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"stat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.085496 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.093780 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.094234 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.094304 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.100225 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.114515 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.123748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.123862 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.125008 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.129053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.140049 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.152477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.167883 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.182653 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.196310 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.207965 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.221187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.224574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.224612 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.224741 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.224805 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:59.724788057 +0000 UTC m=+38.866385844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226960 4775 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.232595 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.240977 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.246720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.255679 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.269106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.284573 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.297536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.314370 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a5
66cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.349697 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.359860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.373283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.384873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.395794 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.406687 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.419659 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.436011 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.449361 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.466024 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.478138 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534216 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534307 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637494 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.704767 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:28:59.817281931 +0000 UTC Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.729771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.729919 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.729993 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.729974002 +0000 UTC m=+39.871571809 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740581 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842626 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.946022 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.946045 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048800 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151183 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253864 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.356989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357141 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.436928 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.437065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437198 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.437159884 +0000 UTC m=+55.578757701 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437221 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437291 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.437269157 +0000 UTC m=+55.578867034 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.437284 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437368 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437426 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.43741329 +0000 UTC m=+55.579011097 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460922 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.538240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.538339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538509 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538550 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538567 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538570 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 
11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538597 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538617 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538634 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.538614604 +0000 UTC m=+55.680212391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538684 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.538662295 +0000 UTC m=+55.680260102 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.563975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564109 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.668847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.705633 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 23:37:52.188041059 +0000 UTC Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.740949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.741125 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.741200 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:02.741177355 +0000 UTC m=+41.882775172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.744845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.744954 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745047 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.745078 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.745161 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745166 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745242 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745379 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772932 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.876994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979841 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.185994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288940 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392155 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.598001 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640910 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.662237 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667309 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.687312 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691546 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.706445 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:08:41.491120866 +0000 UTC Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.710921 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",
\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714611 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.732559 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736474 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.752689 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.752838 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.759302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.773542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.785701 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.800853 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.811608 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.823171 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.834699 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.853600 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc 
kubenswrapper[4775]: I0127 11:21:01.855746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855779 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.866836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481c
e348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.883338 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.895379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.908618 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc 
kubenswrapper[4775]: I0127 11:21:01.921929 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.934181 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.943814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.953653 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 
11:21:01.958127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060129 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060295 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162894 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162982 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264840 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264879 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470769 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.707484 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:44:13.208037678 +0000 UTC Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744933 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745372 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744993 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745670 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744948 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745912 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.745017 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.746227 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.762990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.763151 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.763228 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:06.763205481 +0000 UTC m=+45.904803258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.780936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781416 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091542 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194547 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503539 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.606972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607117 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.708043 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:53:57.558113556 +0000 UTC Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709856 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812750 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915907 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018882 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121905 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224659 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.430966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431056 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533433 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636136 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.708618 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:14:20.711132414 +0000 UTC Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738668 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744128 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744146 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744470 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744558 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744746 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744950 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.745024 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.746078 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841778 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944107 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.045988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046090 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.050098 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.051329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.051593 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.067405 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.078926 4775 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.088630 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.099464 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.108066 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.127320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.145010 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c
3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148672 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.167070 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.179010 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.190654 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc 
kubenswrapper[4775]: I0127 11:21:05.205022 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.217357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.228926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.239846 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 
11:21:05.250569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250597 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250608 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.253296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.266009 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353438 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456236 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559098 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662186 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.709345 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 19:57:16.405542846 +0000 UTC Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763961 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.764015 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.866999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867082 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072241 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072308 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276422 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.378983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379043 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.583005 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685664 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.710203 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:23:09.091822811 +0000 UTC Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744690 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744722 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744762 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744815 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744883 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744953 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.745010 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.788947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.788992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789025 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.806937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.807163 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.807267 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:14.807239912 +0000 UTC m=+53.948837729 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891490 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.993631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.993954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994561 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.097925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.097992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098021 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201719 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305698 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409443 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513766 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616877 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.710796 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:16:38.206933187 +0000 UTC Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.719944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720065 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822753 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925855 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028927 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338790 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442261 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648389 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.711862 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 12:50:05.020245952 +0000 UTC Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744691 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744762 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.744796 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744653 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.744986 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.745077 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.745287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.854940 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855060 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.957912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.957990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958029 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958043 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061542 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.163947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.163998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164032 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267410 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.369998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.472923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576532 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.679997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680126 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.712654 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 12:17:45.66238471 +0000 UTC Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.781999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782103 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885795 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090920 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194208 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.296985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400382 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.502941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503122 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606356 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708760 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.713203 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:19:07.359072324 +0000 UTC
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744613 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744662 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744709 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744678 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.744833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.744978 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.745060 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.745172 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811818 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914489 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120471 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222767 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325702 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.428974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.453944 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.466753 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.478773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.498942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.521006 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531052 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.535109 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.547238 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.556614 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.567389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc 
kubenswrapper[4775]: I0127 11:21:11.578852 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.600106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.621873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633786 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc 
kubenswrapper[4775]: I0127 11:21:11.633825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633865 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.645032 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481c
e348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.672020 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.692647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.713704 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.713666 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 13:20:52.385902479 +0000 UTC Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.727223 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736709 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.741506 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.764465 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.783281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.799167 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a5
66cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.828956 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 
reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 
0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74a
f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839208 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.843586 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847852 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.860947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc 
kubenswrapper[4775]: E0127 11:21:11.868361 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872189 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.876668 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.886215 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889965 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.892565 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.904147 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.911730 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.917688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.930198 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.936007 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8f
a4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.954803 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.954992 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.956939 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957345 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.968878 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.983609 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.996381 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.007588 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.024641 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060488 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162830 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.265007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.265019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367194 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471158 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.573960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676764 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.714406 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 16:58:39.123852364 +0000 UTC Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.743995 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744129 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744158 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744213 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744022 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744582 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744681 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744708 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744867 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.763864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.780023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.780257 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.790676 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.814001 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.834287 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/
networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.849616 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.863268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc 
kubenswrapper[4775]: I0127 11:21:12.881996 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883211 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883251 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.905522 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.926279 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.940132 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b
1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.952706 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.974274 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986629 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.989023 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.005684 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.022285 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.033590 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.050527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.080725 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.083242 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.083889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.084020 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 
27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089361 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.108418 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.121037 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.145293 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4
a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.176677 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-ope
nvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 
crc kubenswrapper[4775]: I0127 11:21:13.191128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191157 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.194527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.206509 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc 
kubenswrapper[4775]: I0127 11:21:13.225424 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.240482 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.261793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.278621 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74db
b5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293588 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293616 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.303312 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.324547 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 
11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.337656 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.350696 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.367146 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.379636 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.391346 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395790 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395817 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497999 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600228 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702576 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.715195 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:32:51.731319433 +0000 UTC Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.807670 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910485 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013192 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.090239 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.091428 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.095191 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096217 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" exitCode=1 Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096354 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.097831 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.098167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.114702 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116325 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.134000 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.148147 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.167100 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc 
kubenswrapper[4775]: I0127 11:21:14.189624 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.208266 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219353 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.227801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.247390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.274548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.291620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.313683 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.322981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323055 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.335179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.353648 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.368190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.383806 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.399865 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.416787 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426662 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.427064 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.529834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530344 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633414 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.715902 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:15:52.257288807 +0000 UTC Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736212 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744366 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.744483 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744501 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.744797 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.745037 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.745073 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.745377 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839632 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.894888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.895697 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.895856 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:30.895809147 +0000 UTC m=+70.037406964 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.102419 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.106311 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.108571 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:15 crc kubenswrapper[4775]: E0127 11:21:15.108794 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.127386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.143219 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148881 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.157239 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.173738 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\
\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.186264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.202744 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc 
kubenswrapper[4775]: I0127 11:21:15.219551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.230954 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.244778 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc 
kubenswrapper[4775]: I0127 11:21:15.251265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251342 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.256744 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945b
cf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.272926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.289947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.305421 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.318173 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.330267 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.343870 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.359302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455938 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455963 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558568 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.716226 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 15:35:52.704164601 +0000 UTC Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764113 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003899 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003999 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107833 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.210963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211082 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314369 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416868 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517149 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517443 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517609 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517568252 +0000 UTC m=+87.659166089 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517665 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517646894 +0000 UTC m=+87.659244761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517790 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517848 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517833968 +0000 UTC m=+87.659431745 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519389 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.618065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.618241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618412 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618416 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618541 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618566 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc 
kubenswrapper[4775]: E0127 11:21:16.618508 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618652 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.618663252 +0000 UTC m=+87.760261069 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618798 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.618779025 +0000 UTC m=+87.760376852 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.717102 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 07:30:55.763514847 +0000 UTC Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724297 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744882 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744882 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745001 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745141 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745266 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745410 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.827017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.827032 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929869 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032421 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.135001 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444401 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547592 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650673 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.717379 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:03:34.50270977 +0000 UTC Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752507 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855299 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957733 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.060010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.060020 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368828 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574829 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678359 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.718005 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:49:15.387990095 +0000 UTC Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744765 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744765 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744891 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745302 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745418 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745534 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.782001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.782018 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988830 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092367 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195473 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.297994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298038 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298076 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401102 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710168 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.718653 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:26:13.43631159 +0000 UTC Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917414 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020623 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124198 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227979 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331568 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645762 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.719378 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:35:39.948251203 +0000 UTC Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744745 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744930 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744983 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.745226 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.745289 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.745536 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.746054 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.746334 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749176 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852229 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955476 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058439 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161382 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.366924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.366995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367080 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470274 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.573956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574165 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.678013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.678057 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.719650 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:46:36.605535352 +0000 UTC Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.767705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf
3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc 
kubenswrapper[4775]: I0127 11:21:21.781505 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.798269 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.820359 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.838009 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.861374 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.882358 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.901056 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.916284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.930878 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.944347 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.959606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.976320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.987963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.987996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988037 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.994196 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.013812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.042663 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name
\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 
11:21:22.055620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.068783 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc 
kubenswrapper[4775]: I0127 11:21:22.091059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091172 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.113007 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.130751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.144495 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.158379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc 
kubenswrapper[4775]: I0127 11:21:22.177300 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.192160 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193427 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.206162 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:
20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209389 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.225791 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.235891 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241633 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.248551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\
\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.260833 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.263780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266239 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.281343 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.282208 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 
11:21:22.286935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.303443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.307588 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.312002 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.319950 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.328812 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.328954 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331023 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331665 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\
"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.344260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.358191 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.374548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.384399 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433321 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535938 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639240 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.720711 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 15:05:43.439728348 +0000 UTC Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.744934 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745076 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745084 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745251 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745375 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745507 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745625 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846253 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949384 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.156928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.157483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.157850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.158151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.158731 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365936 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571675 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675161 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.721125 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:35:40.50168365 +0000 UTC Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982492 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291884 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394628 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497391 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599852 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702875 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.721861 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:42:45.84616998 +0000 UTC Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.744442 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.744725 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745037 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.745165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745711 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.745845 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.746083 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806138 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909127 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115177 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217262 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.319943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.319998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423294 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526113 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628341 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.722246 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:30:41.484704045 +0000 UTC Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833610 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833622 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.071005 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172883 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275135 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.582968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.582996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.723259 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:20:44.641596518 +0000 UTC
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.744836 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.744961 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.745129 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.745156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.745765 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.746156 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed"
Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.746399 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788528 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892759 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996607 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.098979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201924 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407918 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510077 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612258 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.714004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.714014 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.724241 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:30:10.365808804 +0000 UTC
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.918980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919070 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021118 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126578 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229642 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331913 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.332021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.332034 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.434589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.434805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435377 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538148 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640694 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.725291 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 06:11:26.865058265 +0000 UTC
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743853 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743907 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743940 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744001 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744128 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744280 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744405 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846909 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949623 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052167 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257346 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.359998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360068 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462394 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564625 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667250 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.725903 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:17:21.463243599 +0000 UTC Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769913 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872748 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974896 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077404 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179465 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281617 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384163 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588243 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690894 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.726364 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:02:28.127037361 +0000 UTC Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744394 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744436 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744513 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744627 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744639 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744654 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744708 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793552 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896167 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.906545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.906709 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.906809 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:02.906778784 +0000 UTC m=+102.048376591 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016081 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220913 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323244 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425797 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630965 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.726835 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:24:59.347517571 +0000 UTC Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733297 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.757824 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.767122 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.776315 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.786561 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc 
kubenswrapper[4775]: I0127 11:21:31.798500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.810249 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.822717 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc 
kubenswrapper[4775]: I0127 11:21:31.836497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836533 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836959 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481c
e348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.865342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\
\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.886114 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.905816 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.920590 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.931378 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938737 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.940860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.952906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.964511 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.973299 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041528 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041554 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.245774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.245805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.245813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.245825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.245833 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.348196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.348240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.348251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.348268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.348279 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.450970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.451010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.451020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.451033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.451043 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.553801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.553849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.553860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.553876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.553887 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606282 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.621630 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.629982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630084 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.651275 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655642 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.677202 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.692042 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695514 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.707866 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.708015 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710266 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.727318 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:27:17.838835674 +0000 UTC Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.743883 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.743899 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.744013 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744007 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.744062 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744161 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744381 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744583 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812360 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914826 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325569 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428360 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531293 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633292 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.727921 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:40:24.523978253 +0000 UTC Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.736010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.736066 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838715 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941801 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146471 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172666 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" exitCode=1 Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172994 4775 scope.go:117] "RemoveContainer" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.186886 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.197503 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.207423 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.220920 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.234922 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.248373 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.249947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250287 4775 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.263386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.275345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.288612 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.298305 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.310120 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.324434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.342206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name
\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 
11:21:34.352006 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352549 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.361912 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc 
kubenswrapper[4775]: I0127 11:21:34.374582 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.387854 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.454947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.454997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455042 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557613 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659779 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.728093 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:00:23.272324447 +0000 UTC Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744020 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744036 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744101 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744302 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744717 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761893 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.966652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.966956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967482 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.172993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.177090 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.177311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.192840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.209772 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.224764 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.238749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.255536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.268927 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275298 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.281779 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.291705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.305238 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.315615 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.325648 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.338523 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc 
kubenswrapper[4775]: I0127 11:21:35.351089 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.366727 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378767 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.382821 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e876
0aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.400906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481c
e348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.425818 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\
\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.583007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.583016 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.729790 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:59:50.039709723 +0000 UTC Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787930 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890966 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.993968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994029 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994054 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096166 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300946 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403941 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507904 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610508 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.712999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713091 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.730592 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:40:57.454856903 +0000 UTC Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744915 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744928 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745251 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744974 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744962 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745325 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815112 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.917788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020899 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123427 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225635 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332314 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435794 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641319 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.731630 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:55:35.433161811 +0000 UTC Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744216 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.847012 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949823 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052729 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155292 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257617 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360434 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463386 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565903 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565943 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668520 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.732339 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:45:54.639144091 +0000 UTC Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.744949 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.744988 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.745190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.745242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745376 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745534 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745681 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.746390 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.771993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772130 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877873 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980445 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090296 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192300 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.193007 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.196647 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.197701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.198204 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.212871 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.229356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.253532 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.269031 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.283068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299553 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299600 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.302882 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd
3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.320155 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.333159 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.346952 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.358763 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.372480 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc 
kubenswrapper[4775]: I0127 11:21:39.386141 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401422 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.407467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.426215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.441661 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a
3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.461789 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name
\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.471166 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605973 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709145 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.732753 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:09:30.967697001 +0000 UTC Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811818 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811845 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914807 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017847 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121243 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.211607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.212621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.216035 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217615 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" exitCode=1 Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217774 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.219415 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.219916 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223385 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.242936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.265662 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.288481 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a
3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.318738 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 
service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnk
ube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
becfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.325958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326123 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.336320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.356639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc 
kubenswrapper[4775]: I0127 11:21:40.376930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.395571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.415212 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430805 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.431780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.451694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.475429 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.492184 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.517117 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.537728 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.555387 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.576229 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.637017 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.733190 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:02:39.325860329 +0000 UTC Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739797 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744298 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744383 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744496 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744564 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744626 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.745028 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.844144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947654 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153513 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153558 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.227347 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.230065 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.232178 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:41 crc kubenswrapper[4775]: E0127 11:21:41.232521 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.247717 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256227 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.263634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc 
kubenswrapper[4775]: I0127 11:21:41.281986 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.301928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.319733 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.341404 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a
3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: 
I0127 11:21:41.358553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.370839 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-ope
nvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.383952 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.403766 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.420099 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.432688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.447283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461669 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461952 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.477136 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.492014 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.507043 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.517291 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563852 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.666750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.666795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667119 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.733832 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 08:00:24.198113694 +0000 UTC Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.762256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769700 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.774868 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.788253 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.810425 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a
3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.840504 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-ope
nvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.859537 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.874258 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc 
kubenswrapper[4775]: I0127 11:21:41.892965 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.907773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.925552 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.938308 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.952021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.967032 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975212 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.983250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.996674 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.011694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.025162 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079244 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181885 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.284862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.284960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.284979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.285004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.285024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.388162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.388223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.388237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.388257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.388272 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.491397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.491478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.491495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.491517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.491535 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.595950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.596008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.596027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.596050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.596068 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.698472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.698499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.698508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.698520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.698530 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.734895 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 07:07:48.736426403 +0000 UTC Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744288 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744368 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744383 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744535 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744583 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744684 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744944 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.745016 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.803496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.803582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.803632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.803681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.803707 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.906395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.906489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.906510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.906536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.906554 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.973196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.973269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.973293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.973324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.973347 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.988376 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.993843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.993909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.993927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.993952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.993969 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.014981 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.018744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.018791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.018804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.018825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.018839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.060576 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065702 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.085184 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.085407 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191897 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294696 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397501 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500161 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603516 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.735172 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:14:40.153876792 +0000 UTC Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810721 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913648 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.017019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119593 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221645 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323859 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426928 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.530003 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.632914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633065 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.736183 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 00:06:03.070524531 +0000 UTC Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.743916 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744098 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.744370 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.744719 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744829 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.745053 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.745152 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.842943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946862 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049925 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152960 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.255996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256171 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359551 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462649 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564519 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668158 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.736783 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 15:58:02.548296856 +0000 UTC Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873860 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976570 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079886 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182672 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491661 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.594911 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698468 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.737093 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 02:39:44.434089731 +0000 UTC Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744531 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744571 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744605 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.744757 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744844 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.744968 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.745140 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.745383 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801352 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904644 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007478 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111183 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.213970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214121 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317497 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420950 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629149 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732811 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.738013 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 16:17:09.478867748 +0000 UTC Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835896 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938573 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248329 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.351985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455436 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587603 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587574508 +0000 UTC m=+151.729172305 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587679 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587735 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587717783 +0000 UTC m=+151.729315570 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587893 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587969 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587956769 +0000 UTC m=+151.729554556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661369 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc 
kubenswrapper[4775]: I0127 11:21:48.661412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.689268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.689327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689525 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689550 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689567 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689634 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.689617573 +0000 UTC m=+151.831215360 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689644 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689691 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689712 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689803 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.689777548 +0000 UTC m=+151.831375355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.738535 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:00:09.400367687 +0000 UTC Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744871 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744969 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744995 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.745261 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745400 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745521 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745682 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764368 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868441 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971248 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177687 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280262 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384291 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487851 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591910 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695226 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.756009 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 23:02:26.954352502 +0000 UTC Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798537 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798580 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902552 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.004946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005084 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107822 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210349 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313902 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416974 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623927 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.726939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.726988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727030 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744832 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744779 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.744900 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744868 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745002 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745348 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.756509 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:06:21.739436664 +0000 UTC Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933277 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036726 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139890 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.140006 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242757 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346518 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.554009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.554215 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657822 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.757574 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:43:15.416371567 +0000 UTC Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760499 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.768555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.788611 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.809517 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.832615 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a
3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862907 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: 
I0127 11:21:51.862962 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.867209 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 
service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-ope
nvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.883749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.899405 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc 
kubenswrapper[4775]: I0127 11:21:51.917871 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.938953 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa017
0089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.958620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.965974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966063 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.974526 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.991592 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.011178 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.031395 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.049272 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18b
ebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069445 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069733 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.086624 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172942 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277513 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.380864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.380982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381181 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483603 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586388 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689921 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689977 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744753 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744826 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744941 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.744942 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744768 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745202 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745722 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.758249 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:43:44.49215773 +0000 UTC Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793429 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896778 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.102904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.102983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206539 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308930 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412504 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.415002 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.487045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l"] Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.487893 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.492588 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494153 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494277 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494848 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.553478 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gm7w4" podStartSLOduration=68.553431296 podStartE2EDuration="1m8.553431296s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.55319672 +0000 UTC m=+92.694794497" watchObservedRunningTime="2026-01-27 11:21:53.553431296 +0000 UTC m=+92.695029083" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.578536 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" podStartSLOduration=68.578514243 podStartE2EDuration="1m8.578514243s" 
podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.578312058 +0000 UTC m=+92.719909845" watchObservedRunningTime="2026-01-27 11:21:53.578514243 +0000 UTC m=+92.720112020" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.638018 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9dz9r" podStartSLOduration=68.637991913 podStartE2EDuration="1m8.637991913s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.637938912 +0000 UTC m=+92.779536709" watchObservedRunningTime="2026-01-27 11:21:53.637991913 +0000 UTC m=+92.779589720" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") 
pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.650419 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.658609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.678180 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.703505 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.703486026 podStartE2EDuration="1m8.703486026s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.690541982 +0000 UTC m=+92.832139849" watchObservedRunningTime="2026-01-27 11:21:53.703486026 +0000 UTC m=+92.845083803" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.731361 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podStartSLOduration=68.73134458 
podStartE2EDuration="1m8.73134458s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.730780865 +0000 UTC m=+92.872378662" watchObservedRunningTime="2026-01-27 11:21:53.73134458 +0000 UTC m=+92.872942357" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.745101 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:53 crc kubenswrapper[4775]: E0127 11:21:53.745296 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.752692 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.752663604 podStartE2EDuration="42.752663604s" podCreationTimestamp="2026-01-27 11:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.750153246 +0000 UTC m=+92.891751053" watchObservedRunningTime="2026-01-27 11:21:53.752663604 +0000 UTC m=+92.894261421" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.759197 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 08:58:51.329426498 +0000 UTC Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.759264 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 11:21:53 crc kubenswrapper[4775]: 
I0127 11:21:53.765794 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.797622 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" podStartSLOduration=68.797597575 podStartE2EDuration="1m8.797597575s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.779621073 +0000 UTC m=+92.921218890" watchObservedRunningTime="2026-01-27 11:21:53.797597575 +0000 UTC m=+92.939195392" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.798202 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.798193031 podStartE2EDuration="1m7.798193031s" podCreationTimestamp="2026-01-27 11:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.796991108 +0000 UTC m=+92.938588885" watchObservedRunningTime="2026-01-27 11:21:53.798193031 +0000 UTC m=+92.939790848" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.808730 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: W0127 11:21:53.832593 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4114802c_bae6_4711_b8c9_32a0cd83395a.slice/crio-ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1 WatchSource:0}: Error finding container ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1: Status 404 returned error can't find the container with id ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1 Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.283111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" event={"ID":"4114802c-bae6-4711-b8c9-32a0cd83395a","Type":"ContainerStarted","Data":"2ead5dfdc7813ef890cc5608717b27e8d92dffc62dd8f7e7bfc20b5227843729"} Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.283184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" event={"ID":"4114802c-bae6-4711-b8c9-32a0cd83395a","Type":"ContainerStarted","Data":"ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1"} Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.303490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" podStartSLOduration=69.303419791 podStartE2EDuration="1m9.303419791s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:54.303093492 +0000 UTC m=+93.444691309" watchObservedRunningTime="2026-01-27 11:21:54.303419791 +0000 UTC m=+93.445017598" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.304395 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vxn5f" podStartSLOduration=70.304379436 podStartE2EDuration="1m10.304379436s" podCreationTimestamp="2026-01-27 11:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.853687342 +0000 UTC m=+92.995285169" watchObservedRunningTime="2026-01-27 11:21:54.304379436 +0000 UTC m=+93.445977253" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744171 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744593 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744621 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744683 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744735 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744822 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744474 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744169 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744683 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744793 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744297 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744986 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.762605 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744041 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744073 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.744408 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.744780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.745005 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.745220 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.744908 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.745015 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.744926 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745186 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.745597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745711 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745836 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.746109 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.769402 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 11:22:01 crc kubenswrapper[4775]: I0127 11:22:01.803496 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.803429447 podStartE2EDuration="5.803429447s" podCreationTimestamp="2026-01-27 11:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:01.762769863 +0000 UTC m=+100.904367650" watchObservedRunningTime="2026-01-27 11:22:01.803429447 +0000 UTC m=+100.945027254" Jan 27 11:22:01 crc kubenswrapper[4775]: I0127 11:22:01.805758 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.80574344 podStartE2EDuration="1.80574344s" podCreationTimestamp="2026-01-27 11:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:01.802982605 +0000 UTC m=+100.944580462" watchObservedRunningTime="2026-01-27 11:22:01.80574344 +0000 UTC m=+100.947341257" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.744961 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745027 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745035 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745241 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745240 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745661 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.977629 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.977809 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.977931 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:23:06.977890598 +0000 UTC m=+166.119488415 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.744640 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.744792 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745673 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745750 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745795 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.745933 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.746323 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.746591 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:05 crc kubenswrapper[4775]: I0127 11:22:05.745570 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:05 crc kubenswrapper[4775]: E0127 11:22:05.745745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744183 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744133 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744274 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744689 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744727 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.745220 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.744274 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.744835 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.745049 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.745159 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.745200 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.745759 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.746048 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.746343 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744629 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744697 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744826 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744892 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.744942 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745029 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745211 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744268 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744240 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744372 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744434 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744609 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744782 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744858 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744484 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744570 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744488 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.744664 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744516 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.744820 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.745015 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.745104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.743943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744685 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.744760 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745022 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745121 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745182 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.745498 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745843 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744757 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744876 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.744939 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745154 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745196 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745277 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.376532 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377355 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377433 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" exitCode=1 Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298"} Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377623 4775 scope.go:117] "RemoveContainer" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.378183 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.378739 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744163 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744322 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744160 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744484 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744546 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744620 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:21 crc kubenswrapper[4775]: I0127 11:22:21.394663 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:21 crc kubenswrapper[4775]: E0127 11:22:21.691445 4775 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 11:22:21 crc kubenswrapper[4775]: E0127 11:22:21.857099 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744855 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744899 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744875 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.745029 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745203 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745418 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745690 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744687 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744806 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744991 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.744969 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745143 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745362 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745736 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744518 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744563 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744569 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.744709 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744753 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.744934 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.745107 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.745229 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.858648 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744658 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744691 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744718 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744664 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.744833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.744936 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.745044 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.745123 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744299 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744401 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744420 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744641 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744856 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744971 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.745090 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.746185 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.434077 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.436895 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.437849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" 
event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007"} Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.438392 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.470159 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podStartSLOduration=106.470134032 podStartE2EDuration="1m46.470134032s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:31.468149527 +0000 UTC m=+130.609747334" watchObservedRunningTime="2026-01-27 11:22:31.470134032 +0000 UTC m=+130.611731839" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.581917 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.582044 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:31 crc kubenswrapper[4775]: E0127 11:22:31.582180 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:31 crc kubenswrapper[4775]: E0127 11:22:31.861134 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744509 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744625 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744682 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744705 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744842 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744950 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:33 crc kubenswrapper[4775]: I0127 11:22:33.744104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:33 crc kubenswrapper[4775]: E0127 11:22:33.744621 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744182 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744243 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744358 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744574 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744200 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744894 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:35 crc kubenswrapper[4775]: I0127 11:22:35.744056 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:35 crc kubenswrapper[4775]: I0127 11:22:35.744417 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:22:35 crc kubenswrapper[4775]: E0127 11:22:35.744384 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.457955 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.458032 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"} Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.744404 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.744507 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.744650 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.744780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.745943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.746311 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.862371 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:37 crc kubenswrapper[4775]: I0127 11:22:37.744260 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:37 crc kubenswrapper[4775]: E0127 11:22:37.744520 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744537 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744566 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744641 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.744753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.744894 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.745049 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:39 crc kubenswrapper[4775]: I0127 11:22:39.744349 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:39 crc kubenswrapper[4775]: E0127 11:22:39.744593 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744717 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744795 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744817 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.744906 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.745073 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.745252 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:41 crc kubenswrapper[4775]: I0127 11:22:41.744843 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:41 crc kubenswrapper[4775]: E0127 11:22:41.746789 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744292 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744371 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.747744 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.747895 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.748179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.748712 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.744121 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.747011 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.747440 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.827503 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.899019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.944082 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.944706 
4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.945156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.945579 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.950840 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.951760 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.951839 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.953163 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.953682 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.954962 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955001 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955102 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955929 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957304 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957490 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.960045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.960920 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.964062 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.965061 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.974557 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.976239 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.976704 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977272 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977429 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977626 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977677 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977905 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"] Jan 27 11:22:44 
crc kubenswrapper[4775]: I0127 11:22:44.977968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978209 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978738 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978236 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979085 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978741 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979178 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979252 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: 
\"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979384 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979418 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979479 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979525 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979544 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978283 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979615 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978383 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979641 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978434 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979695 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979718 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978844 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979791 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978931 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980414 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 
11:22:44.980807 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980838 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980868 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980958 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.981072 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.985686 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986032 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986611 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986804 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987087 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987153 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987842 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988262 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988467 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988695 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988796 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988988 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989261 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989481 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989592 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989751 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" 
Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989845 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.990034 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.993610 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.993748 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.996431 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:44.999421 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:44.999761 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000071 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000519 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000869 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.001202 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.001524 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.002658 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-97tsz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.002959 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.003502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004069 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004368 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004497 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004639 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004833 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004895 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.005705 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.006192 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.011924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.013187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.013624 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.014406 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.015697 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.016278 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.016859 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.017047 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.017347 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.019864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.020109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.020546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021198 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021369 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021522 4775 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021553 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.015803 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.022358 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.022523 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.023805 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024217 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.026273 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024541 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.027295 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.025084 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034027 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034440 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035045 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035361 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035892 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.036818 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.049558 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.050011 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.050375 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.051695 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.052307 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.052488 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.053876 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055033 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055108 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055322 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055707 4775 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055841 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055981 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056171 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056281 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056352 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056834 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056900 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057241 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057289 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057067 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.057122 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057168 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057215 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057230 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058144 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058314 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058650 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058746 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.059190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060400 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060580 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060995 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.065264 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.066737 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.066820 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.067357 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.068487 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.072339 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.072688 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.073900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.074717 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.075760 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.079527 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.080272 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.090580 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.091690 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.092586 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.092868 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095108 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095148 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchqn\" (UniqueName: 
\"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095798 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095907 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" 
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095977 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096007 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096033 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096063 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096127 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096242 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096274 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096304 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096359 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") 
pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096392 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096524 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096556 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096561 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" 
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096732 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096855 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" 
(UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096900 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod 
\"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097039 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097081 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097102 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097124 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097440 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.098141 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.098974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.099270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100958 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.103640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.104376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: 
\"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod 
\"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.107016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.108113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.108490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.111164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: 
\"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.112009 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.115482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.118704 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.119244 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.120779 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.121762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.122418 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.126043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.127712 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.128000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.128335 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.129302 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.129601 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.130530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.130575 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.131093 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.132147 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.133837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.136355 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.137616 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.138650 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.139584 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.140472 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.141359 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.141426 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.142686 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.143385 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.144331 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.145236 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.146191 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.147536 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.148023 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.148768 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.155276 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.157251 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.159787 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.162897 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.166224 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.167976 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.171266 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.171878 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.174840 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.177050 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.178732 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.179221 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.180657 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.181894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.183301 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.184579 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.186869 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.187143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.188974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.190637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.191797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.193247 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.194503 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.195620 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.196982 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197971 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198017 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198036 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198070 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198120 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198137 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198160 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198319 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198374 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.198409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198495 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198571 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.198734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199318 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199424 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199484 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.200993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201137 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.202003 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.203292 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.204783 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wn6qf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.205429 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.205976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.207314 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.227216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.255498 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.260017 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.262512 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.265022 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.267390 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.287630 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300822 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: 
I0127 11:22:45.301047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301100 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301119 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301138 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301562 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302125 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.303933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.304157 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.305239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.307363 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.327469 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.347608 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.367480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.388185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.407499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.427652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.447760 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.467812 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.487754 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.507340 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.528055 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.547749 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.567252 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.587794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.593403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.608751 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.627643 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.647566 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.651439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.668047 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.687985 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.708604 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.728374 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.748113 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.767409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.788510 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.808366 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.827943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.848079 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.867916 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.894110 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.907537 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.928177 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.947903 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.967694 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.988479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.007790 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.027360 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.047757 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.066315 4775 request.go:700] Waited for 1.006672158s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-q9whj
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.087640 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.107763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.128100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.149035 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.168386 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.187535 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.207831 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.214035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.229392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.248500 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.267718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.279335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.290530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.308096 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.328829 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.355343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.367979 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.387576 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.407474 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.428647 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.448022 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.468217 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.530255 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.548129 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.553618 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.569612 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.588269 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.608969 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.654799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.663410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.705929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.725024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.726714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.745126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.747920 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.768091 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.781582 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.787717 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.808655 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.827978 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.846824 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.848835 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.868385 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.868967 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 27 11:22:46 crc kubenswrapper[4775]: W0127 11:22:46.886236 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdc0fe8_51ba_4939_9220_5f45a846f997.slice/crio-12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0 WatchSource:0}: Error finding container 12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0: Status 404 returned error can't find the container with id 12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.887432 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.892192 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.913169 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.928894 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.944054 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.949787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.950196 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.967994 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.970343 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6"
Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.988925 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.011828 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.048081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.066703 4775 request.go:700] Waited for 1.867298117s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.071656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.090707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.100842 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.108385 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.109934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.134580 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.135747 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.148272 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.153585 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.168933 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.188180 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.207693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.221371 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"]
Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.234569 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b6882d_984d_432b_b3df_101a6437371b.slice/crio-f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e WatchSource:0}: Error finding container f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e: Status 404 returned error can't find the container with id f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.247012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.256112 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.261853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.262689 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7bkr9"
Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.265898 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e034909_37ed_4437_a799_daf81cbe8241.slice/crio-6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc WatchSource:0}: Error finding container 6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc: Status 404 returned error can't find the container with id 6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.283878 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.283963 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.287228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.304412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.322506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.334100 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338260 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339093 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kdj\" (UniqueName: \"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339116 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\"
(UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " 
pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339919 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339969 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339990 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340060 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: 
\"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340077 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdpxd\" (UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340133 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: 
\"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340228 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340302 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340475 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340501 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340587 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc 
kubenswrapper[4775]: I0127 11:22:47.340608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340641 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340679 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341880 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341897 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342581 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342846 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343234 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.343435 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:47.843421865 +0000 UTC m=+146.985019642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343480 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343606 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343624 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344005 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 
11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344146 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.391113 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.445988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446317 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446334 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446380 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446433 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdpxd\" 
(UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446635 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" 
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod 
\"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446946 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc 
kubenswrapper[4775]: I0127 11:22:47.447071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447094 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447129 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447148 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447200 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447236 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447283 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447307 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447327 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447421 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447509 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447533 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447623 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 
11:22:47.447716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " 
pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447884 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447909 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447954 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447983 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448018 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 
11:22:47.448056 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448083 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qthbw\" (UniqueName: \"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448105 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448133 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgp8q\" (UniqueName: 
\"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448306 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448332 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448369 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448387 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.448602 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:47.948572543 +0000 UTC m=+147.090170420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449008 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449054 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449190 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450728 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: 
\"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452336 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452719 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kdj\" (UniqueName: 
\"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452802 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452818 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452868 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452929 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453189 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 
27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453226 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453248 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453244 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: 
\"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453284 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453492 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc 
kubenswrapper[4775]: I0127 11:22:47.455628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.455683 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.455808 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.456176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.457222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc 
kubenswrapper[4775]: I0127 11:22:47.457724 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.458418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.458913 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.459568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.460657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.461092 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.461885 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.462156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.462755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.468641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.469540 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470364 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.471054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.471229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.472438 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.472961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473567 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 
11:22:47.474127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.474817 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.477918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.478671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.481397 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.482391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.484425 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13ee778_6aa2_4c33_92f6_1bddaadc2f82.slice/crio-a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30 WatchSource:0}: Error finding container a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30: Status 404 returned error can't find the container with id a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30 Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.507994 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.508269 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.509726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerStarted","Data":"6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.510610 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"1034cc44d2c89563511470e31f3454cf45e16be2f2cd722989c87d82c6930bfd"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.511464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.513560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerStarted","Data":"f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.514684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7bkr9" event={"ID":"9ad82a99-23f4-4f61-9fa9-535b29e11fc3","Type":"ContainerStarted","Data":"fad1c9b987afda1f535dc920d67215cf88eaea92a38410980f3a3c65e1d900df"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.517372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" event={"ID":"d0acb956-caf6-4999-bc3b-02c0195fe7ad","Type":"ContainerStarted","Data":"ec2b93bd5af82503b510050e9eebe4a74b0576f88a8cab3ce9b60a530649a349"} Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.525185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.554650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555304 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555360 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.556357 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.056342062 +0000 UTC m=+147.197939829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.557323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.557430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.561191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: 
\"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564623 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564702 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qthbw\" (UniqueName: \"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564764 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgp8q\" (UniqueName: 
\"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565098 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565135 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565243 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565297 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 
11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565441 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565612 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565982 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.571742 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.572284 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.573818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.574814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.575608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.576587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.576824 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc 
kubenswrapper[4775]: I0127 11:22:47.577920 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.577946 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.578118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.579129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.579382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.581705 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.583965 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.584814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.586720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.587382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.588277 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.588880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.590526 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.596207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.608358 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.644680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.648395 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.668210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.668461 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.168402327 +0000 UTC m=+147.310000104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.668940 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.669665 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.169650161 +0000 UTC m=+147.311247938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.684760 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.684878 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.691254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.695705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.701334 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.706346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.706548 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.723152 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.729964 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.747153 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.758586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.759973 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.764514 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.767679 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769598 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.770076 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.270061189 +0000 UTC m=+147.411658966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.794509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kdj\" (UniqueName: \"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.800164 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfeddd59_a473_4baa_83d8_4bba68575acb.slice/crio-befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e WatchSource:0}: Error finding container befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e: Status 404 returned error can't find the container with id befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.801038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdpxd\" (UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.812079 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86325a44_a87c_4898_90ce_1d402f969d3a.slice/crio-164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1 WatchSource:0}: Error finding container 164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1: Status 404 returned error can't find the container with id 164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1 Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.833574 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.838985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.863107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.874208 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.874735 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.374718974 +0000 UTC m=+147.516316751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.886242 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.896990 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.907717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.917634 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.923349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.940114 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.941409 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.946949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgp8q\" (UniqueName: \"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.961214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.963344 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.965508 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.967492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.973673 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.974860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.975003 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.474981249 +0000 UTC m=+147.616579026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.975093 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.975373 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.475363969 +0000 UTC m=+147.616961746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.993886 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.003171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.018963 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.023614 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.026631 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:48 crc kubenswrapper[4775]: W0127 11:22:48.032330 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7329644_12a0_4c3e_8a2a_2c38a7b78369.slice/crio-74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652 WatchSource:0}: Error finding container 74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652: Status 404 returned error can't find the container with id 74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.039055 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.047051 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.048220 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.061084 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.077600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.078072 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.57805299 +0000 UTC m=+147.719650777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.083388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.084496 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.085522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qthbw\" (UniqueName: \"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.085590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.090701 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.102137 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.111247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.126787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.127546 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.140037 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.141712 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.167655 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.179335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.179669 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.679656221 +0000 UTC m=+147.821253998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.187584 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.280527 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 
11:22:48.280718 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.780684327 +0000 UTC m=+147.922282104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.281302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.281697 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.781686394 +0000 UTC m=+147.923284171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.313011 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.377284 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.382749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.383058 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.883044068 +0000 UTC m=+148.024641845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.419877 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.431296 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.484890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.485601 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.985577885 +0000 UTC m=+148.127175742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.526088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" event={"ID":"dfeddd59-a473-4baa-83d8-4bba68575acb","Type":"ContainerStarted","Data":"845e4a4325ac169f816929b0885225b08de69b9962a8ada57d9504a172605f09"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.526136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" event={"ID":"dfeddd59-a473-4baa-83d8-4bba68575acb","Type":"ContainerStarted","Data":"befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.534858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"3782a7d51010d42de22c939f82264212b57f408c651f71843c8276f0acdffd3c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.537571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" event={"ID":"fe363a11-e8c8-4b4d-8401-25ba48ff00e0","Type":"ContainerStarted","Data":"120ebc6129f5a64b60659c2552ba56b975cf13e595290591c82f5da98dc04421"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.540578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-wn6qf" event={"ID":"72a39a16-e53a-42b6-a71f-35d74ef633b6","Type":"ContainerStarted","Data":"6905af895f525fa2042d1bc8d6f6ef1681b832cd7095f4375386cac0df17c32f"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.542818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerStarted","Data":"cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.547098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" event={"ID":"e7329644-12a0-4c3e-8a2a-2c38a7b78369","Type":"ContainerStarted","Data":"74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.554785 4775 generic.go:334] "Generic (PLEG): container finished" podID="c13ee778-6aa2-4c33-92f6-1bddaadc2f82" containerID="279d14e0b8e0b15a1f08b767f09272885f007922be07a491f5b709d306ea3fca" exitCode=0 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.554841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerDied","Data":"279d14e0b8e0b15a1f08b767f09272885f007922be07a491f5b709d306ea3fca"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.586375 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.586763 4775 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.086738834 +0000 UTC m=+148.228336611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.588846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.589984 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.089970682 +0000 UTC m=+148.231568459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"7758ca23a9febf551c373233e9ee3829dbee6201a5ca3f810754a266c30d4eba"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"3758b6fddd1a9e2f318035e5c51070c362b3b79635cb65be625a704fb5189664"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"c4a4c89a6f7e35ac292c985fdc168904c23b58849d7caa287017ed777ca9fda9"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.631152 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"0de22aa185f3482beb7638a8c0058e83860be198a7a5af533355dc0f09fc3ecd"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.634306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-7bkr9" event={"ID":"9ad82a99-23f4-4f61-9fa9-535b29e11fc3","Type":"ContainerStarted","Data":"f2139e54b6395976f28e92322924d5b93701257940ce4bf65084cd4010f378eb"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.637248 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.639841 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.640890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" event={"ID":"d0acb956-caf6-4999-bc3b-02c0195fe7ad","Type":"ContainerStarted","Data":"f8bb8a4c324a2ca2550be169f4cf6bf25439f07fa1abbaadf558c3f0b91c501a"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.642950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerStarted","Data":"164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.643108 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.644554 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"59bb84c8b770468d0ee4c1b8565d8c5122a47febd8f7583356c0367f836a0f6c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.651854 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.652184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerDied","Data":"c17441b28133dd588ecc74a62fd3bd351c1021ba05ce5950a1ab5f82501d69da"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.651869 4775 generic.go:334] "Generic (PLEG): container finished" podID="3e034909-37ed-4437-a799-daf81cbe8241" containerID="c17441b28133dd588ecc74a62fd3bd351c1021ba05ce5950a1ab5f82501d69da" exitCode=0 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.658697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerStarted","Data":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.659393 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.661520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" event={"ID":"70e56eaf-e2b2-4431-988a-e39e37012771","Type":"ContainerStarted","Data":"c9abe0adefd21162a37c2594665905d14a41d4a41b98c5a1c9b0c75971990171"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.666904 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerStarted","Data":"139296b53cfcbab11c8831abaf6a0db6d586bb1a2b9f552fe62be0a6c6fbf343"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.669804 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerStarted","Data":"152d04ae80ec3e4ea65562160c2d55c0e2688c495a74f3bc1b1fca916b3879fa"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.671794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" event={"ID":"02e25ab4-d6d1-40f7-8c8c-3920620cfb98","Type":"ContainerStarted","Data":"10603a5cdc83798ecd347cd871e885f724cc2068ba34682eb62531aed0e55e51"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.673250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerStarted","Data":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.673282 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerStarted","Data":"ee61d306bb5f6310bfe18fb9eb63cdf67c00e9b26b5cdca100d7222a8e1ec7f1"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.674222 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.675859 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" event={"ID":"87a94d4a-7341-4e6c-8194-a2e6832dbb01","Type":"ContainerStarted","Data":"e3bd8cfde438c5fb6bd9140a78ceb2157d9f455e3f32509df55aff1acf96c84c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679896 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pg564 container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679943 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679901 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680199 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ssb2w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680233 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680237 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.687316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"5e5371f69716ce0574de0a216a303208b8a3e9005e5543c6bab7ada0112b3257"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.687357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"010f1761f2e0c7ba0b9c516f1d430eaa6808e5460d718db4cf032c917790dee6"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.700085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.702359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.202332795 +0000 UTC m=+148.343930572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.725613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-97tsz" event={"ID":"a9987fd7-5b35-449c-b24a-a38afb77db17","Type":"ContainerStarted","Data":"c40ba767d61350abfb2743430a7155d4424a03f05620ffd40b0e7e8166716eee"} Jan 27 11:22:48 crc kubenswrapper[4775]: W0127 11:22:48.744038 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3655cf31_d392_485f_ba8c_13ccddbe46e1.slice/crio-cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089 WatchSource:0}: Error finding container cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089: Status 404 returned error can't find the container with id cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.800614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.803309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc 
kubenswrapper[4775]: E0127 11:22:48.818563 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.306650351 +0000 UTC m=+148.448248118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.904371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.904699 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.404670155 +0000 UTC m=+148.546267932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.904892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.905150 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.405138207 +0000 UTC m=+148.546735984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.007084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.007490 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.507447077 +0000 UTC m=+148.649044854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.109372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.109692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.609676966 +0000 UTC m=+148.751274733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.218629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.219186 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.719171841 +0000 UTC m=+148.860769618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.321148 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.321700 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.821688048 +0000 UTC m=+148.963285825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.364103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.424890 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.425385 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.925340305 +0000 UTC m=+149.066938082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.529521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.529814 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.029803533 +0000 UTC m=+149.171401310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.634582 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.635015 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.135001172 +0000 UTC m=+149.276598939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.635963 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" podStartSLOduration=124.635947168 podStartE2EDuration="2m4.635947168s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.634426777 +0000 UTC m=+148.776024554" watchObservedRunningTime="2026-01-27 11:22:49.635947168 +0000 UTC m=+148.777544945" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.638093 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" podStartSLOduration=124.638085616 podStartE2EDuration="2m4.638085616s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.61172803 +0000 UTC m=+148.753325807" watchObservedRunningTime="2026-01-27 11:22:49.638085616 +0000 UTC m=+148.779683393" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.650122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.669227 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" podStartSLOduration=124.669210472 podStartE2EDuration="2m4.669210472s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.664968507 +0000 UTC m=+148.806566284" watchObservedRunningTime="2026-01-27 11:22:49.669210472 +0000 UTC m=+148.810808249" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.676304 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:49 crc kubenswrapper[4775]: W0127 11:22:49.693468 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90473f1_e47f_453c_bbe4_52e528e160de.slice/crio-ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450 WatchSource:0}: Error finding container ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450: Status 404 returned error can't find the container with id ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450 Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.698296 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.722917 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" podStartSLOduration=124.722903471 podStartE2EDuration="2m4.722903471s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.719586941 +0000 UTC m=+148.861184718" watchObservedRunningTime="2026-01-27 11:22:49.722903471 
+0000 UTC m=+148.864501248" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.738678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.739212 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.239196054 +0000 UTC m=+149.380793831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.763017 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7bkr9" podStartSLOduration=124.762997921 podStartE2EDuration="2m4.762997921s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.753319008 +0000 UTC m=+148.894916785" watchObservedRunningTime="2026-01-27 11:22:49.762997921 +0000 UTC m=+148.904595688" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.786862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-wn6qf" event={"ID":"72a39a16-e53a-42b6-a71f-35d74ef633b6","Type":"ContainerStarted","Data":"aab0402e77c95fd90d528bd770421994fbae156a077fcb6186ba11b2e8e81ce2"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.799734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" event={"ID":"e7329644-12a0-4c3e-8a2a-2c38a7b78369","Type":"ContainerStarted","Data":"dd754d54437a8c9b48441eb6c302bf775e5bf98c9875ea10a2f0ea56c86c286e"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.809015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.832802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerStarted","Data":"6019d33d0f2e16302611a14b4766602cebac6a9e68651e74db247d66645d97c6"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.833432 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.840210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.841277 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.341250728 +0000 UTC m=+149.482848505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.846199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" event={"ID":"d90473f1-e47f-453c-bbe4-52e528e160de","Type":"ContainerStarted","Data":"ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.847477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" event={"ID":"ebadab77-f881-4ec4-937f-eef9a677edfe","Type":"ContainerStarted","Data":"bd6b5f43dc9212334d387e31c23f8ad1116aba961301f1a1380ac1c649b990ea"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.847502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" event={"ID":"ebadab77-f881-4ec4-937f-eef9a677edfe","Type":"ContainerStarted","Data":"59934139f60739147d6e6ca63fcf9b71336cc295171c236b6134faabd271905e"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.854320 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" podStartSLOduration=124.854306622 podStartE2EDuration="2m4.854306622s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.850109068 +0000 UTC m=+148.991706835" watchObservedRunningTime="2026-01-27 11:22:49.854306622 +0000 UTC m=+148.995904399" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.861800 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.882078 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.917335 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" podStartSLOduration=124.917311735 podStartE2EDuration="2m4.917311735s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.901220437 +0000 UTC m=+149.042818214" watchObservedRunningTime="2026-01-27 11:22:49.917311735 +0000 UTC m=+149.058909512" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.923980 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.924033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" event={"ID":"fe363a11-e8c8-4b4d-8401-25ba48ff00e0","Type":"ContainerStarted","Data":"6dfeb0a8789166cf9f0107eecfc21a282656d21cc7daff75f87e4dfdfd2e2fa8"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.937576 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.944796 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.946512 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.446495938 +0000 UTC m=+149.588093715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.947232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" event={"ID":"3655cf31-d392-485f-ba8c-13ccddbe46e1","Type":"ContainerStarted","Data":"cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.956888 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.975224 4775 generic.go:334] "Generic (PLEG): container finished" podID="86325a44-a87c-4898-90ce-1d402f969d3a" containerID="0c21d0379315e4e1a10c6b8d2a3707db8d5b99085db38635086825a4019e13aa" 
exitCode=0 Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.975306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerDied","Data":"0c21d0379315e4e1a10c6b8d2a3707db8d5b99085db38635086825a4019e13aa"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.993323 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.993370 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerStarted","Data":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.007034 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.032649 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" podStartSLOduration=125.032633218 podStartE2EDuration="2m5.032633218s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.009936292 +0000 UTC m=+149.151534069" watchObservedRunningTime="2026-01-27 11:22:50.032633218 +0000 UTC m=+149.174230995" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.033717 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.046997 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.048497 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.54848294 +0000 UTC m=+149.690080717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.052833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"d534a92e7558c224edc75b37f9c15862614bac308377ab18c1a232908abe8e2d"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.056251 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" podStartSLOduration=125.05623924 podStartE2EDuration="2m5.05623924s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.054334519 +0000 UTC m=+149.195932296" watchObservedRunningTime="2026-01-27 11:22:50.05623924 +0000 UTC 
m=+149.197837017" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.059200 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.076125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"dc43decedabb77a620e6ce95a67fc6b5608fc67504d35af5da4dd14b39a965a4"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.100232 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" podStartSLOduration=125.100216226 podStartE2EDuration="2m5.100216226s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.099009283 +0000 UTC m=+149.240607060" watchObservedRunningTime="2026-01-27 11:22:50.100216226 +0000 UTC m=+149.241814003" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.120768 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"fd5af5b8d2852bff69629b6a7e084786e4de9ab86ef5dff05021843d235893c8"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.133223 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" podStartSLOduration=125.133209972 podStartE2EDuration="2m5.133209972s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.131953438 
+0000 UTC m=+149.273551215" watchObservedRunningTime="2026-01-27 11:22:50.133209972 +0000 UTC m=+149.274807739" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerStarted","Data":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerStarted","Data":"71f62b9e07cf144d54a44160698a6e892c6a6b7a96fbedaace452d7e78d81f2c"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144928 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.148019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.149196 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.649185717 +0000 UTC m=+149.790783494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.158546 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jl5cc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.158589 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.165901 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wn6qf" podStartSLOduration=5.165884419 podStartE2EDuration="5.165884419s" podCreationTimestamp="2026-01-27 11:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.165594222 +0000 UTC m=+149.307191999" watchObservedRunningTime="2026-01-27 11:22:50.165884419 +0000 UTC m=+149.307482186" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.167718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" 
event={"ID":"87a94d4a-7341-4e6c-8194-a2e6832dbb01","Type":"ContainerStarted","Data":"96c61ba0c44a9bdd2148a58561f77800040e6336869ae782384b29c41d5da1cd"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.218185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"0533aa6edd918f14538f576e3deaf63ba4ac74bcea142e8bc767f023163401c0"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.220264 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" podStartSLOduration=125.220253907 podStartE2EDuration="2m5.220253907s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.219821506 +0000 UTC m=+149.361419303" watchObservedRunningTime="2026-01-27 11:22:50.220253907 +0000 UTC m=+149.361851684" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.250678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.252364 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.752326639 +0000 UTC m=+149.893924416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.252408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-97tsz" event={"ID":"a9987fd7-5b35-449c-b24a-a38afb77db17","Type":"ContainerStarted","Data":"24273afb2501ce305d2943100fd626f0dbe4e0d66743d1ac20b04568497aa216"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.271090 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"84c9086b4443674bc142c18247c231f240564432ae9e4c00298d1b63b6118922"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.310770 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" podStartSLOduration=126.310747127 podStartE2EDuration="2m6.310747127s" podCreationTimestamp="2026-01-27 11:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.293106977 +0000 UTC m=+149.434704754" watchObservedRunningTime="2026-01-27 11:22:50.310747127 +0000 UTC m=+149.452344904" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.333886 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" 
event={"ID":"05079674-b89f-4310-98f0-b39caf8f6189","Type":"ContainerStarted","Data":"2db66dd87ec8ce293ea30d0e9e985ac04c1af6ff3ebd7a1d807aa3274f8678f4"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.343960 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hj8rf" podStartSLOduration=125.343938969 podStartE2EDuration="2m5.343938969s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.335416138 +0000 UTC m=+149.477013925" watchObservedRunningTime="2026-01-27 11:22:50.343938969 +0000 UTC m=+149.485536736" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.353052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.354247 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.854232799 +0000 UTC m=+149.995830576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.371069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" event={"ID":"70e56eaf-e2b2-4431-988a-e39e37012771","Type":"ContainerStarted","Data":"9d72bf496b407b7649878b26f66112aa99674c59d371c19c52804c420f8311c7"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.372514 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.372583 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.380992 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.390732 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" podStartSLOduration=125.3907159 podStartE2EDuration="2m5.3907159s" podCreationTimestamp="2026-01-27 
11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.387963905 +0000 UTC m=+149.529561692" watchObservedRunningTime="2026-01-27 11:22:50.3907159 +0000 UTC m=+149.532313677" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.401299 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.428891 4775 csr.go:261] certificate signing request csr-2zzsd is approved, waiting to be issued Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.448505 4775 csr.go:257] certificate signing request csr-2zzsd is issued Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.453918 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.455222 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.955207013 +0000 UTC m=+150.096804790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.530779 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podStartSLOduration=125.530763246 podStartE2EDuration="2m5.530763246s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.461688549 +0000 UTC m=+149.603286326" watchObservedRunningTime="2026-01-27 11:22:50.530763246 +0000 UTC m=+149.672361013" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.558089 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.558396 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.058385397 +0000 UTC m=+150.199983174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.574996 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" podStartSLOduration=125.574980078 podStartE2EDuration="2m5.574980078s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.532681268 +0000 UTC m=+149.674279045" watchObservedRunningTime="2026-01-27 11:22:50.574980078 +0000 UTC m=+149.716577855" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.646003 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-97tsz" podStartSLOduration=125.645986297 podStartE2EDuration="2m5.645986297s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.638077893 +0000 UTC m=+149.779675680" watchObservedRunningTime="2026-01-27 11:22:50.645986297 +0000 UTC m=+149.787584074" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.660187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.660617 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.160598625 +0000 UTC m=+150.302196402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.707428 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" podStartSLOduration=125.707413237 podStartE2EDuration="2m5.707413237s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.706288836 +0000 UTC m=+149.847886633" watchObservedRunningTime="2026-01-27 11:22:50.707413237 +0000 UTC m=+149.849011014" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.743820 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" podStartSLOduration=125.743803245 podStartE2EDuration="2m5.743803245s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.743475167 +0000 UTC 
m=+149.885072944" watchObservedRunningTime="2026-01-27 11:22:50.743803245 +0000 UTC m=+149.885401022" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.761962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.762243 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.262231376 +0000 UTC m=+150.403829153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.819409 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" podStartSLOduration=125.81939292 podStartE2EDuration="2m5.81939292s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.817612262 +0000 UTC m=+149.959210029" watchObservedRunningTime="2026-01-27 11:22:50.81939292 +0000 UTC m=+149.960990697" Jan 27 11:22:50 crc 
kubenswrapper[4775]: I0127 11:22:50.864683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.865004 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.364986479 +0000 UTC m=+150.506584256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.966922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.967226 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:51.467212927 +0000 UTC m=+150.608810704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.977601 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.983580 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:50 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:50 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:50 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.983748 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.068806 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 
11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.070399 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.570381701 +0000 UTC m=+150.711979478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.172197 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.172812 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.672784514 +0000 UTC m=+150.814382291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.273932 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.274207 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.774167079 +0000 UTC m=+150.915764856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.274566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.275123 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.775104605 +0000 UTC m=+150.916702382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.383263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.383637 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.883618513 +0000 UTC m=+151.025216290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.415705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"528d08f059ce3d8b2e449d87760e8588f3bea2f0cd1eecf6bd9cb4d114b6ea60"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.415758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"4b17f1bf2ac6e8c7c305aaea8d2a8d4fe1c6f78427701ff7bf826500091a6b39"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.439256 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" event={"ID":"c5daf300-a879-408f-a78a-c70b0e77f54c","Type":"ContainerStarted","Data":"de42c2e202f71ffbc786ebc358dd070b6c1a2e4b64da6a0f87d192361561cdad"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.439302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" event={"ID":"c5daf300-a879-408f-a78a-c70b0e77f54c","Type":"ContainerStarted","Data":"2f1a6cbdb6ba15deef2e45e6126623f63cdccd0d29901ee45dc4a65925f5f5aa"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.440071 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.451480 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 11:17:50 +0000 UTC, rotation deadline is 2026-12-17 12:26:54.440365864 +0000 UTC Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.451581 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7777h4m2.988788713s for next certificate rotation Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.480920 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" podStartSLOduration=126.480905888 podStartE2EDuration="2m6.480905888s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.478631016 +0000 UTC m=+150.620228793" watchObservedRunningTime="2026-01-27 11:22:51.480905888 +0000 UTC m=+150.622503665" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.485303 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.488629 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.988617557 +0000 UTC m=+151.130215334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.492914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"43b6f8605a4e7694ccbd8aecd38778b552b9690904788d0df5f7ed221d595dd8"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.525625 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" podStartSLOduration=126.525606553 podStartE2EDuration="2m6.525606553s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.517833421 +0000 UTC m=+150.659431198" watchObservedRunningTime="2026-01-27 11:22:51.525606553 +0000 UTC m=+150.667204330" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.531665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"a1f77a6fb9d108229dd2b40a0bd7b77e59d4a5aa01a11175087fc6c7550fd453"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.531701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"db5eb3f5c96a180043b72a5496cb4347932089ad2fcb0cd3cbe05aa5447d84c2"} Jan 
27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.548890 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"12374620bc7d271a79c62cb620cefbdc0f810ebc145076a7b886613bef5c69c7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581142 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"80b0a4e92569f9e147b3229daf7f0ffe4f2659861e0f4244c02bce07ec4ef696"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"5a28e03b68a57f45bf360daf44711f1e6ac6505fa52536f50c1eb227f666bf14"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581723 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.586983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.587305 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.087289559 +0000 UTC m=+151.228887336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.643512 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"eee1a9656c2f41d432bbf844e4977c11a157f7c30e4e81da4d63ed75788879cc"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.655731 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" podStartSLOduration=126.655714058 podStartE2EDuration="2m6.655714058s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.653054336 +0000 UTC m=+150.794652113" watchObservedRunningTime="2026-01-27 11:22:51.655714058 +0000 UTC m=+150.797311835" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.658206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cnwdf" 
event={"ID":"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9","Type":"ContainerStarted","Data":"9b30a015543d93b1a33b63a65f0a742a246f4327aeef0d453c6e13a62cff9288"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.658250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cnwdf" event={"ID":"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9","Type":"ContainerStarted","Data":"4b07e67c0f4e51027b9d8114576e4d16b08e24ab63fe1e79121135e3421f5350"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.690503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.691694 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.191681786 +0000 UTC m=+151.333279563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.699970 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" event={"ID":"5cf81fd9-7041-48eb-acff-470663fc9987","Type":"ContainerStarted","Data":"8107c7b877227e25164b7d0d1a366a18d1fa4c97d2dc74d6468dd892a5d56cbb"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.700013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" event={"ID":"5cf81fd9-7041-48eb-acff-470663fc9987","Type":"ContainerStarted","Data":"31ae6e7018fcf1e9a9c974ec56e22d7f27ef67f348440c9e0d446abf85ad35b7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.700732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.705306 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cnwdf" podStartSLOduration=7.705293656 podStartE2EDuration="7.705293656s" podCreationTimestamp="2026-01-27 11:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.704237667 +0000 UTC m=+150.845835444" watchObservedRunningTime="2026-01-27 11:22:51.705293656 +0000 UTC m=+150.846891423" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.706857 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" event={"ID":"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e","Type":"ContainerStarted","Data":"b209e173952f26f4dc314786df4e9b28d480467980940127fbe701e2e2c30f0a"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.706900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" event={"ID":"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e","Type":"ContainerStarted","Data":"913c5c74a03be31c906de368eab588682face76ab8ed48f102ae5349ad389a37"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.707732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.713009 4775 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c4826 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.713061 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" podUID="5cf81fd9-7041-48eb-acff-470663fc9987" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.715415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"da7d86930bccd9656409d12cb3e639ea3e4064717d262f6e0b687c01a82d4d7c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.715440 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"c7dd8f22cd9ccccf950c2cb368f466450b2b0d0d5aaf13383aeb86f052bfbf97"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716232 4775 patch_prober.go:28] interesting pod/console-operator-58897d9998-p6jjk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716259 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" podUID="ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" event={"ID":"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8","Type":"ContainerStarted","Data":"36136d3a0eb9a873346fbbde8eb88c862390f091561d2a899de16ee76cc45b6d"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" event={"ID":"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8","Type":"ContainerStarted","Data":"ad63fa9301fa0e7d96096d745998f27df39fc9132ccaa4783c77839cd75aa285"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.718277 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" 
event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"b51e6b91945a903237ff01650c0d16cfe3ac0054bc8d395893e71d66851fcdd6"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.769836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" event={"ID":"05079674-b89f-4310-98f0-b39caf8f6189","Type":"ContainerStarted","Data":"3c210595a983f9f135eadad00e5c5b4d9fcdde9142f1ddc7ca78135733655eeb"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.769870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerStarted","Data":"bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.776191 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" podStartSLOduration=126.776178803 podStartE2EDuration="2m6.776178803s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.764941347 +0000 UTC m=+150.906539124" watchObservedRunningTime="2026-01-27 11:22:51.776178803 +0000 UTC m=+150.917776580" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.780224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" event={"ID":"3655cf31-d392-485f-ba8c-13ccddbe46e1","Type":"ContainerStarted","Data":"20ed37cb270f3e63f0e3af77729d40f3065657e5d2390f476b3b2afaf3889e16"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.786770 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerStarted","Data":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.787752 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.792257 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.794354 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.294331255 +0000 UTC m=+151.435929032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.806680 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-krl46 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.806738 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.812617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"b1c85aa7a3fdc901b21d61123d0c8de3b6683b00c3f63b3a77635318b27457da"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.812657 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"37ccad49aefbf211461d72e8ca18ffca80c3d6df8cfe9ff2c445094ed5935b89"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.822305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" event={"ID":"02e25ab4-d6d1-40f7-8c8c-3920620cfb98","Type":"ContainerStarted","Data":"a69523693d53de97ec0c0446bafc9645aa373df1a002ade301bcfdd030ff8051"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.847643 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" podStartSLOduration=126.847626104 podStartE2EDuration="2m6.847626104s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.845480315 +0000 UTC m=+150.987078092" watchObservedRunningTime="2026-01-27 11:22:51.847626104 +0000 UTC m=+150.989223881" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.853808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" event={"ID":"d90473f1-e47f-453c-bbe4-52e528e160de","Type":"ContainerStarted","Data":"015f40b86d1688cba57d7c70e75205c818e9ded18ee1fff0c16f1af8420d6cb7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.878292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"3b819ad549d3512db136f84299cb077290603b03ab7d6936e0f18fdfd4c7a772"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.878337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"ddd30e319025170daeca1ab0eef175d79bb88f40b4e3e7788b9f7d3419069f86"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.894313 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.894620 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.39460691 +0000 UTC m=+151.536204687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.898415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"529e6bd8040e6bdb7cbb3acffbab7bc9591d02a0cc51b16f12950ee913d081ec"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.898464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"5be4fed82fb5e89e14a93619decf3059ba064286bddf2781ef54adefa1ec4520"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900176 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" event={"ID":"03843cd3-d8c8-4007-b9d5-c1d2254c1677","Type":"ContainerStarted","Data":"660bea006c33bdb516d36c9db325277b958f06b70d3e022444e3efefb0d887b9"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" event={"ID":"03843cd3-d8c8-4007-b9d5-c1d2254c1677","Type":"ContainerStarted","Data":"ec136e7681689ccafbb0ad9b7c9d01830a866f7e87a5ee742bd42f1dae6c13c1"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900745 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.920028 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" podStartSLOduration=126.920013901 podStartE2EDuration="2m6.920013901s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.917867073 +0000 UTC m=+151.059464850" watchObservedRunningTime="2026-01-27 11:22:51.920013901 +0000 UTC m=+151.061611678" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.922134 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerStarted","Data":"23c3b08f08e63d195f100b793de22fc9e44e05a4e327d2b6a4565f96a1e3d31c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.924930 4775 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9s82p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.924970 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" podUID="03843cd3-d8c8-4007-b9d5-c1d2254c1677" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.971043 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.972200 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.994951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.996824 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.496807618 +0000 UTC m=+151.638405395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:51.999805 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:52 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.016026 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.106217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.107327 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:52.607313701 +0000 UTC m=+151.748911478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.136805 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" podStartSLOduration=127.136788972 podStartE2EDuration="2m7.136788972s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.0675069 +0000 UTC m=+151.209104677" watchObservedRunningTime="2026-01-27 11:22:52.136788972 +0000 UTC m=+151.278386749" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.184529 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" podStartSLOduration=127.184512369 podStartE2EDuration="2m7.184512369s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.137766829 +0000 UTC m=+151.279364606" watchObservedRunningTime="2026-01-27 11:22:52.184512369 +0000 UTC m=+151.326110146" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.186314 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" 
podStartSLOduration=127.186306567 podStartE2EDuration="2m7.186306567s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.185673511 +0000 UTC m=+151.327271288" watchObservedRunningTime="2026-01-27 11:22:52.186306567 +0000 UTC m=+151.327904344" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.207866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.208554 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.708539642 +0000 UTC m=+151.850137419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.283932 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.284707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.286844 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" podStartSLOduration=127.28683579 podStartE2EDuration="2m7.28683579s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.281869605 +0000 UTC m=+151.423467392" watchObservedRunningTime="2026-01-27 11:22:52.28683579 +0000 UTC m=+151.428433567" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.309263 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.309623 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.809610729 +0000 UTC m=+151.951208506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.316679 4775 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-m7xvw container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.316827 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" podUID="86325a44-a87c-4898-90ce-1d402f969d3a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.324418 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" podStartSLOduration=127.324405081 podStartE2EDuration="2m7.324405081s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.32289814 +0000 UTC m=+151.464495917" watchObservedRunningTime="2026-01-27 
11:22:52.324405081 +0000 UTC m=+151.466002858" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.402660 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.410818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.411325 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.911303882 +0000 UTC m=+152.052901659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.432223 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" podStartSLOduration=127.43220419 podStartE2EDuration="2m7.43220419s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.403274264 +0000 UTC m=+151.544872041" watchObservedRunningTime="2026-01-27 11:22:52.43220419 +0000 UTC m=+151.573801967" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.472716 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" podStartSLOduration=127.472702231 podStartE2EDuration="2m7.472702231s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.433153397 +0000 UTC m=+151.574751184" watchObservedRunningTime="2026-01-27 11:22:52.472702231 +0000 UTC m=+151.614300008" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.512382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" 
(UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.512728 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.012717359 +0000 UTC m=+152.154315136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.515946 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podStartSLOduration=127.515930026 podStartE2EDuration="2m7.515930026s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.513956212 +0000 UTC m=+151.655553999" watchObservedRunningTime="2026-01-27 11:22:52.515930026 +0000 UTC m=+151.657527803" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.614763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.614955 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.615029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.615705 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.115679247 +0000 UTC m=+152.257277024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.616641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.646955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.691686 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716886 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.719860 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.219839908 +0000 UTC m=+152.361437685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.721911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.725422 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.818371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.818728 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:53.318683694 +0000 UTC m=+152.460281471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.920037 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.920616 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.420590783 +0000 UTC m=+152.562188560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.978771 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.981263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"9705c2b3fefc46caeeb544d7b64cdba84f896dd91fcbe3088799eebeaf53357c"} Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.981589 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.995759 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:52 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.995805 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.999584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.002286 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.021222 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.023553 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.52352934 +0000 UTC m=+152.665127117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.027310 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dqrtf" podStartSLOduration=9.027289853 podStartE2EDuration="9.027289853s" podCreationTimestamp="2026-01-27 11:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.02239687 +0000 UTC m=+152.163994647" watchObservedRunningTime="2026-01-27 11:22:53.027289853 +0000 UTC m=+152.168887630" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.054249 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"022a60d70c8f489f04235d83dda5d9802ffca3a71bdbb75b0a68c22e8fbbba57"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.123982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.124350 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.62433105 +0000 UTC m=+152.765928827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.144172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"21123bb550dca958669ca6ce3d94621b6e5742881b4df027084b04904262a607"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.175788 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" podStartSLOduration=128.175752958 podStartE2EDuration="2m8.175752958s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.095974049 +0000 UTC m=+152.237571826" watchObservedRunningTime="2026-01-27 11:22:53.175752958 +0000 UTC m=+152.317350735" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.184755 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"06729a94d59fe2bf7b82240ae98f53ecb463942d56a2cbc8655b6705ea1ae5f2"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.193915 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" podStartSLOduration=128.193900641 podStartE2EDuration="2m8.193900641s" 
podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.176594291 +0000 UTC m=+152.318192078" watchObservedRunningTime="2026-01-27 11:22:53.193900641 +0000 UTC m=+152.335498428" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.205231 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.226550 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.229904 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.729880779 +0000 UTC m=+152.871478546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.237977 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.238298 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.348750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.349073 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.849042817 +0000 UTC m=+152.990640584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.450392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.451331 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.951308016 +0000 UTC m=+153.092905793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.552038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.552359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.052346812 +0000 UTC m=+153.193944589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.655323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.655605 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.155578048 +0000 UTC m=+153.297175825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: W0127 11:22:53.699525 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008 WatchSource:0}: Error finding container cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008: Status 404 returned error can't find the container with id cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008 Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.735171 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.762605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.762912 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:54.262896564 +0000 UTC m=+153.404494341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: W0127 11:22:53.819554 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70 WatchSource:0}: Error finding container c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70: Status 404 returned error can't find the container with id c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70 Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.864728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.865085 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.36506909 +0000 UTC m=+153.506666867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.966184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.966493 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.466481607 +0000 UTC m=+153.608079384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.975555 4775 patch_prober.go:28] interesting pod/apiserver-76f77b778f-zcbc6 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]log ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]etcd ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/generic-apiserver-start-informers ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/max-in-flight-filter ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 27 11:22:53 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectcache ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-startinformers ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 27 11:22:53 crc 
kubenswrapper[4775]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 27 11:22:53 crc kubenswrapper[4775]: livez check failed Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.975628 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" podUID="c13ee778-6aa2-4c33-92f6-1bddaadc2f82" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.985030 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:53 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.985077 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.067890 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.068279 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:54.568262653 +0000 UTC m=+153.709860430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.169443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.169764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.669748501 +0000 UTC m=+153.811346268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.194197 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"1cd6906c2a7f11739cfe596d63f02ade73f04a959b79aab945acb154b74a4a6f"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.194321 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"2b37a4e5846ffcc455f9af22719def2e84bb9449d85c1bd8d80710adfa29facd"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.195331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1a915e56e5ae74e56c915dd2921b645a0311bb1185597fdf6c5424ab11c0820e"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.195439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"87314c930b0861c7348bfa1221b4b44c849ad00c8621536a26eaeab571b0aae8"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.196460 4775 generic.go:334] "Generic (PLEG): container finished" podID="04906ea0-5e8b-4e8b-8f20-c46587da8346" 
containerID="bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817" exitCode=0 Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.196535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerDied","Data":"bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.197662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c6f6919fe2ebe16c3cbcfaff7c1dbde15471ff43e9e94ebc6eed87bd1e425780"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.197700 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.200096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2fd34dd2b279752de359aee68435a5a849a7bcf3be93777a43c64852b07afe88"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.200151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.270436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.270556 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.77053373 +0000 UTC m=+153.912131497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.270782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.272601 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.772585195 +0000 UTC m=+153.914182972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.321768 4775 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.327129 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8snw"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.328024 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.329854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.339419 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375313 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.375432 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.87540814 +0000 UTC m=+154.017005917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375636 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " 
pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.376210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.376254 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.876237973 +0000 UTC m=+154.017835750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.477741 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.477986 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: 
\"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478081 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.478268 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.978238914 +0000 UTC m=+154.119836691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.502716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.523857 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkb7p"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.524926 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.526953 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.537151 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.580141 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.080123723 +0000 UTC m=+154.221721500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.646305 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681074 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681349 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " 
pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.681915 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.18189829 +0000 UTC m=+154.323496067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.682104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.698636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.723667 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.724602 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.738798 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784552 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784623 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784696 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 
11:22:54.784971 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.28495967 +0000 UTC m=+154.426557447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.840121 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888569 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.888764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.388738341 +0000 UTC m=+154.530336118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: 
\"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.889476 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.38946717 +0000 UTC m=+154.531064947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.889664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.889739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.908037 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: 
\"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.924908 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.926031 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.929314 4775 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T11:22:54.321796003Z","Handler":null,"Name":""} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.931638 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.933708 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.940251 4775 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.940325 4775 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.983164 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:54 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 
11:22:54 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:54 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.983597 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.989702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.989981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.990016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.990044 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: 
\"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.996520 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.057773 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 
11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.095028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.103294 4775 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.103336 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.117244 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.134093 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.161542 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.216414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"59805f77353717ff12dee61f4aae7e868e50fe076e3edfb7635250eae886c87b"} Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.223054 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05" exitCode=0 Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.224497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"} Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.224542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerStarted","Data":"1eec3f7497774ba660fe56e1601efacc89958991dbb3752466e04ed907d8b155"} Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.228840 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.247381 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.255022 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" podStartSLOduration=10.255001354000001 podStartE2EDuration="10.255001354s" podCreationTimestamp="2026-01-27 11:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:55.236659066 +0000 UTC m=+154.378256863" watchObservedRunningTime="2026-01-27 11:22:55.255001354 +0000 UTC m=+154.396599131" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.265413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.280608 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a822f4_b93b_497d_bfc6_cf4f13cc8140.slice/crio-290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff WatchSource:0}: Error finding container 290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff: Status 404 returned error can't find the container with id 290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.319303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"] Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.425529 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.426162 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.428877 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.429377 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.438128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.511160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.511223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.589619 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.611471 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb40aba_c103_4a72_abd7_3e5b3aaa82e5.slice/crio-8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477 WatchSource:0}: Error 
finding container 8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477: Status 404 returned error can't find the container with id 8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477 Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.613424 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.614223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.614712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.616885 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.644109 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715023 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715118 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.716265 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume" (OuterVolumeSpecName: "config-volume") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.717324 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.718928 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j" (OuterVolumeSpecName: "kube-api-access-vjv2j") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "kube-api-access-vjv2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.722781 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.726150 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a37ccd_52c3_49cc_8db8_1f0069dee3c3.slice/crio-6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8 WatchSource:0}: Error finding container 6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8: Status 404 returned error can't find the container with id 6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8 Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.752344 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.759647 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817439 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817663 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817723 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.978740 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:55 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:55 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:55 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.979133 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.981113 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.984757 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41348a87_6415_41fe_97a9_bcc552d7bc8e.slice/crio-53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336 WatchSource:0}: Error finding container 53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336: Status 404 returned error can't find the container with id 53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336 Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.232845 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerDied","Data":"cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.233213 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 
11:22:56.233291 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235150 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689" exitCode=0 Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235217 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerStarted","Data":"6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.238940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerStarted","Data":"53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerStarted","Data":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243417 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" 
event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerStarted","Data":"8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243478 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.246801 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371" exitCode=0 Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.246873 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.247421 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerStarted","Data":"290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254251 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1" exitCode=0 Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" 
event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"5e3718fa7769c29d58e7ea6f7af42eff70181f72f7af0705859deb32581a0268"} Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.295784 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" podStartSLOduration=131.295768258 podStartE2EDuration="2m11.295768258s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:56.292989753 +0000 UTC m=+155.434587530" watchObservedRunningTime="2026-01-27 11:22:56.295768258 +0000 UTC m=+155.437366035" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.523556 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"] Jan 27 11:22:56 crc kubenswrapper[4775]: E0127 11:22:56.524077 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.524177 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.524383 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.526134 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.527990 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.534472 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"] Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: 
\"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.739154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.739298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.761648 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " 
pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.854489 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.922336 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.923415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.931325 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.975021 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.978752 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:56 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:56 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:56 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.978798 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.979124 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 
11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " 
pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.152149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.152654 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.220334 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.254718 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263235 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263282 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263323 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263362 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.271289 4775 generic.go:334] "Generic (PLEG): container finished" podID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerID="bc1e9cdd7d4a6a3a9ed8e7ee85f8174c8273fced2b790ad4cf14063463698b52" exitCode=0 Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.272054 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerDied","Data":"bc1e9cdd7d4a6a3a9ed8e7ee85f8174c8273fced2b790ad4cf14063463698b52"} Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.287019 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.294669 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.309732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.533606 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.535117 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.540095 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.540888 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.559928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.559967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.560004 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.600928 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.600983 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.615044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.622199 4775 patch_prober.go:28] interesting pod/console-f9d7485db-hj8rf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.622270 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hj8rf" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661390 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.662972 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.663524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.680328 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7htj4\" 
(UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.864408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.925815 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.928048 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.933884 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979800 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpls5\" (UniqueName: 
\"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.990658 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:57 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:57 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:57 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.990727 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " 
pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.082888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.083041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.107412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.242622 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281868 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae" exitCode=0 Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerStarted","Data":"67afb1f7244fe2812d1d4acb97266d9ec82321b9084603e7c3e8b7b7b66acb18"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286172 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e" exitCode=0 Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerStarted","Data":"f7dc6e40e63c860fc724ef492981f5e211c90e6c7db158d9132d52f25b456767"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.381344 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-v5q62"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.659200 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.660121 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.662488 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.662552 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.664229 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.684376 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.692122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.692174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.698916 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"41348a87-6415-41fe-97a9-bcc552d7bc8e\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"41348a87-6415-41fe-97a9-bcc552d7bc8e\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "41348a87-6415-41fe-97a9-bcc552d7bc8e" (UID: "41348a87-6415-41fe-97a9-bcc552d7bc8e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.795132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.795349 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.799412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41348a87-6415-41fe-97a9-bcc552d7bc8e" (UID: "41348a87-6415-41fe-97a9-bcc552d7bc8e"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.809405 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.897427 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.978890 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:58 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:58 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:58 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.978954 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.006876 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298222 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196" exitCode=0 Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298534 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298559 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerStarted","Data":"aada0f1adaa2b58806b9e0dc31f109b054a31ac70cb0eb0272c44c192348a37d"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308572 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4" exitCode=0 Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308688 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"03e04acda80c448d05ec4f5110391d0366fd0fc80d319a5ee3f1ee1c23fc4573"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317395 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerDied","Data":"53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317421 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317500 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.347321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.264141 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.264216 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.273740 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:23:00 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:23:00 crc kubenswrapper[4775]: 
[+]process-running ok Jan 27 11:23:00 crc kubenswrapper[4775]: healthz check failed Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.273817 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.396514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerStarted","Data":"615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9"} Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.977006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.979396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:23:01 crc kubenswrapper[4775]: I0127 11:23:01.411731 4775 generic.go:334] "Generic (PLEG): container finished" podID="8c496d7e-2613-433b-95bf-95257c8f2887" containerID="6eb47812cd5da34ef545cda10bc22d8bfb78abac828d9aedc2550c073bd33895" exitCode=0 Jan 27 11:23:01 crc kubenswrapper[4775]: I0127 11:23:01.412584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerDied","Data":"6eb47812cd5da34ef545cda10bc22d8bfb78abac828d9aedc2550c073bd33895"} Jan 27 11:23:02 crc kubenswrapper[4775]: I0127 11:23:02.980512 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:23:03 crc kubenswrapper[4775]: I0127 11:23:03.136123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:23:05 crc kubenswrapper[4775]: I0127 11:23:05.938519 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.082785 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"8c496d7e-2613-433b-95bf-95257c8f2887\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.082887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"8c496d7e-2613-433b-95bf-95257c8f2887\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.083033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8c496d7e-2613-433b-95bf-95257c8f2887" (UID: "8c496d7e-2613-433b-95bf-95257c8f2887"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.083361 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.101862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8c496d7e-2613-433b-95bf-95257c8f2887" (UID: "8c496d7e-2613-433b-95bf-95257c8f2887"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.184176 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerDied","Data":"615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9"} Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452383 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452437 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.994956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.016570 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.157465 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.276554 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.722846 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.727309 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:23:15 crc kubenswrapper[4775]: I0127 11:23:15.167795 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.149482 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.150031 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qsks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vkb7p_openshift-marketplace(f1ecb76d-1e7c-4889-ab6d-451e8b534308): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.151246 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" Jan 27 11:23:20 crc 
kubenswrapper[4775]: E0127 11:23:20.893228 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.903380 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.904832 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmb78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wpqn9_openshift-marketplace(5415a9cc-8755-41e6-bd7b-1542339cadc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.906503 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" Jan 27 11:23:27 crc 
kubenswrapper[4775]: E0127 11:23:27.584586 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.615336 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.615590 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpls5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lw9xz_openshift-marketplace(232e2caf-d6b3-47b9-9ca0-45aec1e95045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.616772 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lw9xz" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" Jan 27 11:23:27 crc 
kubenswrapper[4775]: E0127 11:23:27.624721 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.624872 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7htj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-v5q62_openshift-marketplace(3ae6a7af-e7d7-440b-b7cb-366edba2d44e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.626060 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.013884 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:23:28 crc kubenswrapper[4775]: W0127 11:23:28.050359 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc945c8b1_655c_4522_b703_0c5b9b8fcf38.slice/crio-cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8 WatchSource:0}: Error finding container cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8: Status 404 returned error can't find the container with id cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.106957 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"96a178553693bb50d5c7991838e3e488c6c839e4c6b764c6715f32a493532ed4"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593868 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"bf6180b3257485dc4fb651f383e57a343ac1477cfeaf469138f42716d9d8c7b4"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.597258 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.597360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.602940 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.602985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.606904 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 
11:23:28.606991 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.615357 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-b48nk" podStartSLOduration=163.615328212 podStartE2EDuration="2m43.615328212s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:28.609539115 +0000 UTC m=+187.751136912" watchObservedRunningTime="2026-01-27 11:23:28.615328212 +0000 UTC m=+187.756926009" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.627873 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.628643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"} Jan 27 11:23:28 crc kubenswrapper[4775]: E0127 11:23:28.631108 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.517654 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.518301 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.638011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerStarted","Data":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.640870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerStarted","Data":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.650826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerStarted","Data":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.653338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerStarted","Data":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.661705 4775 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-marketplace-t4skp" podStartSLOduration=2.936087174 podStartE2EDuration="33.661688759s" podCreationTimestamp="2026-01-27 11:22:56 +0000 UTC" firstStartedPulling="2026-01-27 11:22:58.287281341 +0000 UTC m=+157.428879118" lastFinishedPulling="2026-01-27 11:23:29.012882936 +0000 UTC m=+188.154480703" observedRunningTime="2026-01-27 11:23:29.661090503 +0000 UTC m=+188.802688280" watchObservedRunningTime="2026-01-27 11:23:29.661688759 +0000 UTC m=+188.803286536" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.682439 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5kd8m" podStartSLOduration=2.840422261 podStartE2EDuration="35.682400832s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.248931256 +0000 UTC m=+155.390529033" lastFinishedPulling="2026-01-27 11:23:29.090909787 +0000 UTC m=+188.232507604" observedRunningTime="2026-01-27 11:23:29.677572881 +0000 UTC m=+188.819170668" watchObservedRunningTime="2026-01-27 11:23:29.682400832 +0000 UTC m=+188.823998619" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.703078 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8snw" podStartSLOduration=1.74425539 podStartE2EDuration="35.703056553s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:55.228579156 +0000 UTC m=+154.370176933" lastFinishedPulling="2026-01-27 11:23:29.187380279 +0000 UTC m=+188.328978096" observedRunningTime="2026-01-27 11:23:29.701142721 +0000 UTC m=+188.842740518" watchObservedRunningTime="2026-01-27 11:23:29.703056553 +0000 UTC m=+188.844654330" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.720104 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fchbb" podStartSLOduration=2.875851394 
podStartE2EDuration="35.720080917s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.238937484 +0000 UTC m=+155.380535261" lastFinishedPulling="2026-01-27 11:23:29.083166997 +0000 UTC m=+188.224764784" observedRunningTime="2026-01-27 11:23:29.718797832 +0000 UTC m=+188.860395629" watchObservedRunningTime="2026-01-27 11:23:29.720080917 +0000 UTC m=+188.861678714" Jan 27 11:23:32 crc kubenswrapper[4775]: I0127 11:23:32.986441 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.646772 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.646826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.796617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.846897 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.059173 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.059218 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.096369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 
11:23:35.247939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.248023 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.295181 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.446983 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 11:23:35 crc kubenswrapper[4775]: E0127 11:23:35.447224 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447237 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: E0127 11:23:35.447259 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447265 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447397 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447410 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447792 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.450540 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.451430 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.466581 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.533410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.533548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635431 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635554 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.663269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.759887 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.760448 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.775545 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.818557 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.319510 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.695732 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerStarted","Data":"80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48"} Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.696205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerStarted","Data":"8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91"} Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.699581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"} Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.712654 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.71263967 podStartE2EDuration="1.71263967s" podCreationTimestamp="2026-01-27 11:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:36.71043325 +0000 UTC m=+195.852031027" watchObservedRunningTime="2026-01-27 11:23:36.71263967 +0000 UTC m=+195.854237447" Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.924875 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:23:37 crc 
kubenswrapper[4775]: I0127 11:23:37.255275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.255327 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.295006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.525598 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.708215 4775 generic.go:334] "Generic (PLEG): container finished" podID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerID="80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48" exitCode=0 Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.708292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerDied","Data":"80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48"} Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.711774 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045" exitCode=0 Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.711882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"} Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.712632 4775 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/certified-operators-5kd8m" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server" containerID="cri-o://b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" gracePeriod=2 Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.712965 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fchbb" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server" containerID="cri-o://88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" gracePeriod=2 Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.763590 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.193430 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.265685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.266311 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.266425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod 
\"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.273417 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5" (OuterVolumeSpecName: "kube-api-access-6z4x5") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "kube-api-access-6z4x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.278329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities" (OuterVolumeSpecName: "utilities") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.279407 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.327668 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367382 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367532 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367572 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367844 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367888 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367899 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.368418 
4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities" (OuterVolumeSpecName: "utilities") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.370596 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9" (OuterVolumeSpecName: "kube-api-access-xfzg9") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "kube-api-access-xfzg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.414213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468839 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468877 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468887 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719104 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" exitCode=0 Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"} Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719190 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff"} Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719239 4775 scope.go:117] "RemoveContainer" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.721238 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"} Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728113 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" exitCode=0 Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728152 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fchbb" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"} Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8"} Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.741461 4775 scope.go:117] "RemoveContainer" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.751642 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkb7p" podStartSLOduration=2.817677543 podStartE2EDuration="44.751601492s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.256214444 +0000 UTC m=+155.397812211" lastFinishedPulling="2026-01-27 11:23:38.190138383 +0000 UTC m=+197.331736160" observedRunningTime="2026-01-27 11:23:38.748008942 +0000 UTC m=+197.889606719" watchObservedRunningTime="2026-01-27 11:23:38.751601492 +0000 UTC m=+197.893199269" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.766430 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.773359 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"] Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.783739 4775 scope.go:117] "RemoveContainer" 
containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.798991 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.805264 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fchbb"] Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826003 4775 scope.go:117] "RemoveContainer" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.826467 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": container with ID starting with b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454 not found: ID does not exist" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826509 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"} err="failed to get container status \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": rpc error: code = NotFound desc = could not find container \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": container with ID starting with b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454 not found: ID does not exist" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826559 4775 scope.go:117] "RemoveContainer" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.826948 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": container with ID starting with a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575 not found: ID does not exist" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827020 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"} err="failed to get container status \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": rpc error: code = NotFound desc = could not find container \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": container with ID starting with a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575 not found: ID does not exist" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827057 4775 scope.go:117] "RemoveContainer" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.827312 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": container with ID starting with e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371 not found: ID does not exist" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827342 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"} err="failed to get container status \"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": rpc error: code = NotFound desc = could not find container 
\"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": container with ID starting with e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371 not found: ID does not exist" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827368 4775 scope.go:117] "RemoveContainer" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.844546 4775 scope.go:117] "RemoveContainer" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.874023 4775 scope.go:117] "RemoveContainer" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.901376 4775 scope.go:117] "RemoveContainer" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.902860 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": container with ID starting with 88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3 not found: ID does not exist" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.902891 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"} err="failed to get container status \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": rpc error: code = NotFound desc = could not find container \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": container with ID starting with 88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3 not found: ID does not exist" Jan 27 11:23:38 crc 
kubenswrapper[4775]: I0127 11:23:38.902926 4775 scope.go:117] "RemoveContainer" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.907706 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": container with ID starting with ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b not found: ID does not exist" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.907755 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"} err="failed to get container status \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": rpc error: code = NotFound desc = could not find container \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": container with ID starting with ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b not found: ID does not exist" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.907785 4775 scope.go:117] "RemoveContainer" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689" Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.908263 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": container with ID starting with 13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689 not found: ID does not exist" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689" Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.908292 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"} err="failed to get container status \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": rpc error: code = NotFound desc = could not find container \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": container with ID starting with 13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689 not found: ID does not exist" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.098496 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"71ab4c03-397b-4240-a3f7-c731b6b4331f\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"71ab4c03-397b-4240-a3f7-c731b6b4331f\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177987 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71ab4c03-397b-4240-a3f7-c731b6b4331f" (UID: "71ab4c03-397b-4240-a3f7-c731b6b4331f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.181881 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71ab4c03-397b-4240-a3f7-c731b6b4331f" (UID: "71ab4c03-397b-4240-a3f7-c731b6b4331f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.279425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.279499 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735608 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerDied","Data":"8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91"} Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735644 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735693 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.757190 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" path="/var/lib/kubelet/pods/57a822f4-b93b-497d-bfc6-cf4f13cc8140/volumes" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.757922 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" path="/var/lib/kubelet/pods/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3/volumes" Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.927490 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.927745 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t4skp" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server" containerID="cri-o://09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" gracePeriod=2 Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.416203 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494218 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494281 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.495130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities" (OuterVolumeSpecName: "utilities") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.498895 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd" (OuterVolumeSpecName: "kube-api-access-fqkzd") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "kube-api-access-fqkzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.519322 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595849 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595883 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595893 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743077 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" exitCode=0 Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"} Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743168 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"67afb1f7244fe2812d1d4acb97266d9ec82321b9084603e7c3e8b7b7b66acb18"} Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743209 4775 scope.go:117] "RemoveContainer" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.756201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"} Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.773543 4775 scope.go:117] "RemoveContainer" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.784151 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.788048 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"] Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.796370 4775 scope.go:117] "RemoveContainer" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.814827 4775 scope.go:117] "RemoveContainer" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.815337 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": container with ID starting with 09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8 not found: ID does not exist" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815365 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"} err="failed to get container status \"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": rpc error: code = NotFound desc = could not find container \"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": container with ID starting with 09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8 not found: ID does not exist" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815388 4775 scope.go:117] "RemoveContainer" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796" Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.815886 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": container with ID starting with 2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796 not found: ID does not exist" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815938 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"} err="failed to get container status \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": rpc error: code = NotFound desc = could not find container \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": container with ID 
starting with 2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796 not found: ID does not exist" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815976 4775 scope.go:117] "RemoveContainer" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae" Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.816286 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": container with ID starting with 15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae not found: ID does not exist" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae" Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.816320 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"} err="failed to get container status \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": rpc error: code = NotFound desc = could not find container \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": container with ID starting with 15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae not found: ID does not exist" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.764140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" path="/var/lib/kubelet/pods/615aabb4-e21b-4941-ba5d-d6148cee87af/volumes" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.769826 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154" exitCode=0 Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.769965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"} Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852301 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852842 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852877 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852895 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852902 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852910 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852916 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852924 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852930 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: 
E0127 11:23:41.852939 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852947 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852956 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852962 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852968 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852974 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-utilities" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852985 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852990 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852996 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853002 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.853009 
4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853015 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-content" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853124 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853137 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853151 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853161 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853686 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.856205 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.856396 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.857991 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.912932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.913025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.913169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.016090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.043031 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.178167 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.697583 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 11:23:42 crc kubenswrapper[4775]: W0127 11:23:42.705362 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c9821b5_66df_49d6_a096_1494e7cdda93.slice/crio-3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f WatchSource:0}: Error finding container 3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f: Status 404 returned error can't find the container with id 3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.798048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerStarted","Data":"3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f"} Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.802937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"} Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.818501 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lw9xz" podStartSLOduration=3.930616983 podStartE2EDuration="45.81848292s" podCreationTimestamp="2026-01-27 11:22:57 +0000 UTC" firstStartedPulling="2026-01-27 11:23:00.39935255 +0000 UTC m=+159.540950337" lastFinishedPulling="2026-01-27 11:23:42.287218497 +0000 UTC 
m=+201.428816274" observedRunningTime="2026-01-27 11:23:42.815170989 +0000 UTC m=+201.956768766" watchObservedRunningTime="2026-01-27 11:23:42.81848292 +0000 UTC m=+201.960080697" Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.821913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerStarted","Data":"b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72"} Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.825170 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79" exitCode=0 Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.825233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"} Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.827658 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042" exitCode=0 Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.827725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"} Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.846156 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.846136003 podStartE2EDuration="2.846136003s" podCreationTimestamp="2026-01-27 11:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:43.84277095 +0000 UTC m=+202.984368737" watchObservedRunningTime="2026-01-27 11:23:43.846136003 +0000 UTC m=+202.987733780" Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.835017 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerStarted","Data":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"} Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.838247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerStarted","Data":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"} Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.841499 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.842280 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.860721 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v5q62" podStartSLOduration=2.876722269 podStartE2EDuration="47.860702886s" podCreationTimestamp="2026-01-27 11:22:57 +0000 UTC" firstStartedPulling="2026-01-27 11:22:59.309918423 +0000 UTC m=+158.451516200" lastFinishedPulling="2026-01-27 11:23:44.29389903 +0000 UTC m=+203.435496817" observedRunningTime="2026-01-27 11:23:44.857801377 +0000 UTC m=+203.999399164" watchObservedRunningTime="2026-01-27 11:23:44.860702886 +0000 UTC m=+204.002300663" Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.872957 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-wpqn9" podStartSLOduration=2.7067073649999998 podStartE2EDuration="48.872934809s" podCreationTimestamp="2026-01-27 11:22:56 +0000 UTC" firstStartedPulling="2026-01-27 11:22:58.287563889 +0000 UTC m=+157.429161666" lastFinishedPulling="2026-01-27 11:23:44.453791323 +0000 UTC m=+203.595389110" observedRunningTime="2026-01-27 11:23:44.870716591 +0000 UTC m=+204.012314368" watchObservedRunningTime="2026-01-27 11:23:44.872934809 +0000 UTC m=+204.014532586" Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.895353 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:23:45 crc kubenswrapper[4775]: I0127 11:23:45.897897 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.854680 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.855045 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.907576 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:23:47 crc kubenswrapper[4775]: I0127 11:23:47.865910 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:47 crc kubenswrapper[4775]: I0127 11:23:47.866238 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.243302 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.243348 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.304576 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.902814 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" probeResult="failure" output=< Jan 27 11:23:48 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:23:48 crc kubenswrapper[4775]: > Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.919484 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.329177 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.329919 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lw9xz" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" containerID="cri-o://3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" gracePeriod=2 Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.866827 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884804 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" exitCode=0 Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"} Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884880 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"03e04acda80c448d05ec4f5110391d0366fd0fc80d319a5ee3f1ee1c23fc4573"} Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884932 4775 scope.go:117] "RemoveContainer" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.911595 4775 scope.go:117] "RemoveContainer" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.933871 4775 scope.go:117] "RemoveContainer" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.949644 4775 scope.go:117] "RemoveContainer" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.955041 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": container with ID starting with 3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7 not found: ID does not exist" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955088 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"} err="failed to get container status \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": rpc error: code = NotFound desc = could not find container \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": container with ID starting with 3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7 not found: ID does not exist" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955122 4775 scope.go:117] "RemoveContainer" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154" Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.955752 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": container with ID starting with 45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154 not found: ID does not exist" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955788 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"} err="failed to get container status \"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": rpc error: code = NotFound desc = could not find container 
\"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": container with ID starting with 45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154 not found: ID does not exist" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955807 4775 scope.go:117] "RemoveContainer" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4" Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.956198 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": container with ID starting with 8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4 not found: ID does not exist" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4" Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.956227 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"} err="failed to get container status \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": rpc error: code = NotFound desc = could not find container \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": container with ID starting with 8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4 not found: ID does not exist" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.053897 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities" (OuterVolumeSpecName: "utilities") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.060252 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5" (OuterVolumeSpecName: "kube-api-access-cpls5") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "kube-api-access-cpls5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.154078 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.154134 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.216886 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.255346 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.522696 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.530040 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:23:53 crc kubenswrapper[4775]: I0127 11:23:53.755142 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" path="/var/lib/kubelet/pods/232e2caf-d6b3-47b9-9ca0-45aec1e95045/volumes" Jan 27 11:23:56 crc kubenswrapper[4775]: I0127 11:23:56.921250 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:23:57 crc kubenswrapper[4775]: I0127 11:23:57.904658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:57 crc kubenswrapper[4775]: I0127 11:23:57.964295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517744 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517809 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517861 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.518444 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.518542 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4" gracePeriod=600 Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945065 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4" exitCode=0 Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945135 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"} Jan 27 11:24:00 crc kubenswrapper[4775]: I0127 11:24:00.850595 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" containerID="cri-o://505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" gracePeriod=15 Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.319902 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.354953 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"] Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355278 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-content" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355306 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-content" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355329 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-utilities" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355342 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-utilities" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355387 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355412 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355425 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355645 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355684 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.356256 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.379275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"] Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489274 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489673 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489765 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489847 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489882 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489912 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod 
\"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489940 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489977 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490018 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490047 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" 
(UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490786 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491072 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491120 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491176 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491196 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491267 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491292 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491322 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491347 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: 
"27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491379 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491370 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 
11:24:01.491825 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491860 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492037 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492062 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492076 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492088 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492100 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.497215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.499567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.501861 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf" (OuterVolumeSpecName: "kube-api-access-vbwzf") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "kube-api-access-vbwzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593152 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593210 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593427 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593424 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.594260 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.594385 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.595163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.595311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596124 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596251 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596340 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596426 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596540 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596655 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbwzf\" (UniqueName: 
\"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596780 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596898 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596495 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599531 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599604 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599879 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.602413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.603403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.604728 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.612535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.680845 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964867 4775 generic.go:334] "Generic (PLEG): container finished" podID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" exitCode=0 Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerDied","Data":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerDied","Data":"71f62b9e07cf144d54a44160698a6e892c6a6b7a96fbedaace452d7e78d81f2c"} Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.965006 4775 scope.go:117] "RemoveContainer" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.965290 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.001869 4775 scope.go:117] "RemoveContainer" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" Jan 27 11:24:02 crc kubenswrapper[4775]: E0127 11:24:02.003443 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": container with ID starting with 505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3 not found: ID does not exist" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.003562 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} err="failed to get container status \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": rpc error: code = NotFound desc = could not find container \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": container with ID starting with 505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3 not found: ID does not exist" Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.019642 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.023405 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.151009 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"] Jan 27 11:24:02 crc kubenswrapper[4775]: W0127 11:24:02.157002 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8218f8_a92b_41ec_bbc0_56ab92db9285.slice/crio-1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073 WatchSource:0}: Error finding container 1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073: Status 404 returned error can't find the container with id 1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073 Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.980803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" event={"ID":"4c8218f8-a92b-41ec-bbc0-56ab92db9285","Type":"ContainerStarted","Data":"e992f4f62fbcfcda8d01ff36a4a42daec58c7a0934049b58e4704240fe47eb7b"} Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.980853 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" event={"ID":"4c8218f8-a92b-41ec-bbc0-56ab92db9285","Type":"ContainerStarted","Data":"1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073"} Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.981108 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.003222 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" podStartSLOduration=28.003202375 podStartE2EDuration="28.003202375s" podCreationTimestamp="2026-01-27 11:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:24:03.002307157 +0000 UTC m=+222.143904954" watchObservedRunningTime="2026-01-27 11:24:03.003202375 +0000 UTC m=+222.144800162" Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.019346 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.753682 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" path="/var/lib/kubelet/pods/27ef9f09-90fd-490f-a8b6-912a84eb05c5/volumes" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.637043 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.638586 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.638716 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639138 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" gracePeriod=15 Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639175 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" gracePeriod=15 Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639201 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" gracePeriod=15 Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639173 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" gracePeriod=15 Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639247 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" gracePeriod=15 Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639420 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639663 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639684 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639700 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639724 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639732 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639744 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639751 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639761 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639768 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639781 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639788 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639817 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639824 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639941 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639956 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639970 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639979 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639990 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.640001 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.640012 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704655 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.742218 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805615 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 
11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807206 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.032190 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:24:21 crc kubenswrapper[4775]: E0127 11:24:21.059122 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e92c0f5c582b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,LastTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.101489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7f62667413fd238f0fa32d4f5ea8db5ada528df2641989942fb36daae3dce93c"} Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.103440 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerID="b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72" exitCode=0 Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.103548 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerDied","Data":"b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72"} Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.105035 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.108739 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.109673 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.110469 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.112061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.112979 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" exitCode=0 Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113016 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" exitCode=0 Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113029 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" exitCode=0 Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113037 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" exitCode=2 Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113085 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.760308 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.761114 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.761575 4775 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.127655 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.132604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d"} Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.133629 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.134172 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.384365 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.384997 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.385343 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.428974 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429199 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429266 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock" (OuterVolumeSpecName: "var-lock") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429513 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429535 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.434253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.531477 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.104203 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.105005 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.105661 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.106127 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.106600 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 
11:24:23.137931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerDied","Data":"3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f"} Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.137954 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.137964 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.140280 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141284 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" exitCode=0 Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141474 4775 scope.go:117] "RemoveContainer" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141865 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.150727 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.151139 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.151492 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.157299 4775 scope.go:117] "RemoveContainer" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.172265 4775 scope.go:117] "RemoveContainer" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.187104 4775 scope.go:117] "RemoveContainer" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.199779 4775 scope.go:117] "RemoveContainer" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" Jan 27 11:24:23 crc 
kubenswrapper[4775]: I0127 11:24:23.213701 4775 scope.go:117] "RemoveContainer" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.230371 4775 scope.go:117] "RemoveContainer" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.231478 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": container with ID starting with fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c not found: ID does not exist" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231522 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"} err="failed to get container status \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": rpc error: code = NotFound desc = could not find container \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": container with ID starting with fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231552 4775 scope.go:117] "RemoveContainer" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.231872 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": container with ID starting with f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89 not found: ID does not exist" 
containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231937 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"} err="failed to get container status \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": rpc error: code = NotFound desc = could not find container \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": container with ID starting with f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89 not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231983 4775 scope.go:117] "RemoveContainer" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.232331 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": container with ID starting with 80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699 not found: ID does not exist" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232361 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"} err="failed to get container status \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": rpc error: code = NotFound desc = could not find container \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": container with ID starting with 80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699 not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232380 4775 scope.go:117] 
"RemoveContainer" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.232771 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": container with ID starting with ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57 not found: ID does not exist" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232818 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"} err="failed to get container status \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": rpc error: code = NotFound desc = could not find container \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": container with ID starting with ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57 not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232850 4775 scope.go:117] "RemoveContainer" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.233360 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": container with ID starting with 169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629 not found: ID does not exist" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233388 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"} err="failed to get container status \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": rpc error: code = NotFound desc = could not find container \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": container with ID starting with 169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629 not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233406 4775 scope.go:117] "RemoveContainer" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685" Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.233718 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": container with ID starting with 55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685 not found: ID does not exist" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233765 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"} err="failed to get container status \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": rpc error: code = NotFound desc = could not find container \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": container with ID starting with 55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685 not found: ID does not exist" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240333 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240358 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240563 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240612 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240629 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.342099 4775 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.455333 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.455818 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.456165 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection 
refused" Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.752308 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.138763 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e92c0f5c582b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,LastTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.163671 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.164338 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.164776 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.165431 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.166174 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:29 crc kubenswrapper[4775]: I0127 11:24:29.166224 4775 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.166661 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.367345 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.768803 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Jan 27 11:24:30 crc kubenswrapper[4775]: E0127 11:24:30.570480 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Jan 27 11:24:31 crc kubenswrapper[4775]: I0127 11:24:31.749379 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:31 crc kubenswrapper[4775]: I0127 11:24:31.750828 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:32 crc kubenswrapper[4775]: E0127 11:24:32.173496 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.217146 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.217966 4775 generic.go:334] "Generic 
(PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058" exitCode=1 Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.218011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058"} Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.218886 4775 scope.go:117] "RemoveContainer" containerID="e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.219428 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.220772 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.222916 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: E0127 11:24:35.375261 4775 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.744677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.745637 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.746200 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.746813 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.776748 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.776795 4775 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:35 crc kubenswrapper[4775]: E0127 11:24:35.777894 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.778684 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:35 crc kubenswrapper[4775]: W0127 11:24:35.806586 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f WatchSource:0}: Error finding container 58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f: Status 404 returned error can't find the container with id 58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.025734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225003 4775 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4329eb3a61399ce0eceab91ddd8193e207cb20c01a496f40e5dd919acf58610d" exitCode=0 Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4329eb3a61399ce0eceab91ddd8193e207cb20c01a496f40e5dd919acf58610d"} Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225095 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f"} Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225370 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225383 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225785 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:36 crc kubenswrapper[4775]: E0127 11:24:36.225927 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.226049 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.226298 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.229873 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.229934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef5a81e518ca785a434d6b7a0dee3b7169508b0e02080e4d4bd936956d71c34d"} Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.230839 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.231304 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.231681 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.242697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d1d7fce1e44af4af008b17532db5642d39fc1175df717b330e89ccd272c0655"} Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f341f77111d8af986366f6a41d8857f5fed9bde322762eeb2c45dcdac876be8a"} Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4be0d78c406c2a2f3155a654c43c68c680ad827ab6c7b59733fb13fb1d1d197"} Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df4fb11c5c8d9f1acf57252492a6144675a36c93322cbcf4ffc12aef2c80a277"} Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.753005 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.757662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.250328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"247b34de559934811170768cc750edf0379f5cbb22b9a8a6f0721defd0aa3dc1"} Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.251050 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.250669 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.251238 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.780122 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.780523 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.789398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:43 crc kubenswrapper[4775]: I0127 11:24:43.262861 4775 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285292 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285609 
4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.288853 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.290734 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e34a4318-042c-4b5f-8f23-f4d269294fe1" Jan 27 11:24:45 crc kubenswrapper[4775]: I0127 11:24:45.289831 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:45 crc kubenswrapper[4775]: I0127 11:24:45.289869 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877" Jan 27 11:24:46 crc kubenswrapper[4775]: I0127 11:24:46.029295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:24:51 crc kubenswrapper[4775]: I0127 11:24:51.777142 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e34a4318-042c-4b5f-8f23-f4d269294fe1" Jan 27 11:24:52 crc kubenswrapper[4775]: I0127 11:24:52.186820 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.136144 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.283123 4775 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.885555 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.005825 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.619805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.622116 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.704064 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.708364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.849668 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.859936 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.902923 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.933687 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 
11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.024552 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.099527 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.141689 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.143419 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.191434 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.681200 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.734307 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.787685 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.994895 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.074349 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.078947 4775 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"audit" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.280651 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.290905 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.320559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.386350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.388131 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.403505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.432753 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.479938 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.484934 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.505296 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.600184 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.684645 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.698109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.700599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.720833 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.830864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.976679 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.008425 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.110530 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.190211 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.315719 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 
11:24:57.363918 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.442708 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.523487 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.560310 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.627026 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.652870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.654388 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.661923 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.693046 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.699705 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.842692 4775 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.866652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.148703 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.185139 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.253932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.291896 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.377813 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.380161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.417546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.449809 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.529371 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.591549 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.605198 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.606839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.619809 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.695604 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.744586 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.797700 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.881862 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.940533 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.007885 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.075895 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.143162 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.327834 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.408350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.418341 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.534488 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.586703 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.657694 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.659726 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.676210 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.760137 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 
11:24:59.760138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.767926 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.770052 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.909329 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.931245 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.046032 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.047338 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.090901 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.129941 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.144309 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.148487 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 11:25:00 crc kubenswrapper[4775]: 
I0127 11:25:00.151224 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.153369 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.165311 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.167234 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.310343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.361332 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.366714 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.451559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.469153 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.507528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.570109 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.574658 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.677292 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.798636 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.818136 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.972909 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.274048 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.312902 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.342468 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.356546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.452718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.507540 4775 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.553170 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.609233 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.669300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.689662 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.719914 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.755263 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.764641 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.784849 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.793679 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.852241 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 
11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.924302 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.942973 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.184982 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.195259 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.210145 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.269295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.313419 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.402845 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.412068 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.532311 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.578943 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.605854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.631262 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.695937 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.747063 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.786368 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.893157 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.943282 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.965389 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.051092 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.077889 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 11:25:03 crc 
kubenswrapper[4775]: I0127 11:25:03.136341 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.194498 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.197503 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.265753 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.381646 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.444909 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.479114 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.488909 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.490636 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.665604 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 
11:25:03.678109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.692313 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.716756 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.752641 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.782446 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.814905 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.859219 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.918993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.972264 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.020223 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.041915 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 11:25:04 crc 
kubenswrapper[4775]: I0127 11:25:04.047739 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.245679 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.353786 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.361923 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.361892127 podStartE2EDuration="44.361892127s" podCreationTimestamp="2026-01-27 11:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:24:42.986922663 +0000 UTC m=+262.128520450" watchObservedRunningTime="2026-01-27 11:25:04.361892127 +0000 UTC m=+283.503489944" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.363841 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.363966 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.364017 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9","openshift-marketplace/community-operators-vkb7p","openshift-marketplace/redhat-operators-v5q62","openshift-marketplace/marketplace-operator-79b997595-krl46","openshift-marketplace/certified-operators-s8snw"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.364567 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-s8snw" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" containerID="cri-o://a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365214 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server" containerID="cri-o://9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365312 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" containerID="cri-o://0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365556 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" containerID="cri-o://2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365714 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" containerID="cri-o://05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.371944 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.372534 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.372560 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.372774 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.373708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.375822 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.402319 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.455921 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.455902256 podStartE2EDuration="21.455902256s" podCreationTimestamp="2026-01-27 11:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:25:04.453950687 +0000 UTC m=+283.595548504" watchObservedRunningTime="2026-01-27 11:25:04.455902256 +0000 UTC m=+283.597500033" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.491901 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.491957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.492065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.509816 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68158dce_8840_47f8_8dac_37abc28edc74.slice/crio-0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5415a9cc_8755_41e6_bd7b_1542339cadc6.slice/crio-9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5415a9cc_8755_41e6_bd7b_1542339cadc6.slice/crio-conmon-9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:25:04 crc 
kubenswrapper[4775]: I0127 11:25:04.569766 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.579355 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.595067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.595213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.597274 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.597365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: 
\"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.606644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.612233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.618357 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.647674 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.647971 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" 
containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.648189 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.648242 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-s8snw" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.667187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.710097 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.718296 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.748678 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.772791 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.781858 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.792176 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.799864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.841014 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.843921 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.845396 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.846013 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.846093 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.851750 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.870334 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.882331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.897774 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.902508 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.911894 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.952930 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004304 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: 
I0127 11:25:05.004334 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: "68158dce-8840-47f8-8dac-37abc28edc74"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005402 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005549 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities" (OuterVolumeSpecName: "utilities") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006327 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities" (OuterVolumeSpecName: "utilities") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006407 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.008481 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78" (OuterVolumeSpecName: "kube-api-access-nmb78") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "kube-api-access-nmb78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.009314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: "68158dce-8840-47f8-8dac-37abc28edc74"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.009782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd" (OuterVolumeSpecName: "kube-api-access-4q8jd") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: "68158dce-8840-47f8-8dac-37abc28edc74"). InnerVolumeSpecName "kube-api-access-4q8jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.010500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4" (OuterVolumeSpecName: "kube-api-access-7htj4") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "kube-api-access-7htj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.010582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks" (OuterVolumeSpecName: "kube-api-access-7qsks") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "kube-api-access-7qsks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.014221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.017332 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq" (OuterVolumeSpecName: "kube-api-access-962qq") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "kube-api-access-962qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018477 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018529 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018863 4775 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018881 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018891 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018900 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018909 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018919 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018927 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018936 4775 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018944 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.020189 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities" (OuterVolumeSpecName: "utilities") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.021119 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities" (OuterVolumeSpecName: "utilities") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.029305 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.038129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.043870 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.049555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.067542 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.078245 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119858 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119898 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119914 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119929 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119941 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.130162 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.131739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.182130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.196710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"]
Jan 27 11:25:05 crc kubenswrapper[4775]: W0127 11:25:05.206929 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc92bcc5_aeca_4736_b861_e6f1540a15d1.slice/crio-f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578 WatchSource:0}: Error finding container f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578: Status 404 returned error can't find the container with id f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.220617 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.242803 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.302534 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.325572 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.344659 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.369198 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456280 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" exitCode=0
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456393 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"f7dc6e40e63c860fc724ef492981f5e211c90e6c7db158d9132d52f25b456767"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456557 4775 scope.go:117] "RemoveContainer" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460114 4775 generic.go:334] "Generic (PLEG): container finished" podID="68158dce-8840-47f8-8dac-37abc28edc74" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c" exitCode=0
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerDied","Data":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerDied","Data":"139296b53cfcbab11c8831abaf6a0db6d586bb1a2b9f552fe62be0a6c6fbf343"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.461666 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.463579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.463677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.464802 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.466677 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qxmcq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body=
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.466782 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470672 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" exitCode=0
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"1eec3f7497774ba660fe56e1601efacc89958991dbb3752466e04ed907d8b155"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470953 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.476889 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27" exitCode=0
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477128 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"aada0f1adaa2b58806b9e0dc31f109b054a31ac70cb0eb0272c44c192348a37d"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480229 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" exitCode=0
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"5e3718fa7769c29d58e7ea6f7af42eff70181f72f7af0705859deb32581a0268"}
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480553 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.483462 4775 scope.go:117] "RemoveContainer" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.488065 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.500998 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podStartSLOduration=7.500977385 podStartE2EDuration="7.500977385s" podCreationTimestamp="2026-01-27 11:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:25:05.496902582 +0000 UTC m=+284.638500419" watchObservedRunningTime="2026-01-27 11:25:05.500977385 +0000 UTC m=+284.642575162"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.513619 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.553993 4775 scope.go:117] "RemoveContainer" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.566057 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.583725 4775 scope.go:117] "RemoveContainer" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.584207 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53\": container with ID starting with 9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53 not found: ID does not exist" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.584280 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"} err="failed to get container status \"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53\": rpc error: code = NotFound desc = could not find container \"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53\": container with ID starting with 9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585009 4775 scope.go:117] "RemoveContainer" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.585510 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": container with ID starting with 6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042 not found: ID does not exist" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585551 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"} err="failed to get container status \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": rpc error: code = NotFound desc = could not find container \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": container with ID starting with 6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585578 4775 scope.go:117] "RemoveContainer" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.585857 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": container with ID starting with b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e not found: ID does not exist" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585883 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"} err="failed to get container status \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": rpc error: code = NotFound desc = could not find container \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": container with ID starting with b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585905 4775 scope.go:117] "RemoveContainer" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.600141 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.601777 4775 scope.go:117] "RemoveContainer" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.602959 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": container with ID starting with 0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c not found: ID does not exist" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.603013 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"} err="failed to get container status \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": rpc error: code = NotFound desc = could not find container \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": container with ID starting with 0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.603042 4775 scope.go:117] "RemoveContainer" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.608301 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.614420 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.620542 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.629030 4775 scope.go:117] "RemoveContainer" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.630227 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.638208 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.646397 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.651490 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.653917 4775 scope.go:117] "RemoveContainer" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.654890 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.657645 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.658369 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.669789 4775 scope.go:117] "RemoveContainer" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670287 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": container with ID starting with a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c not found: ID does not exist" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670327 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"} err="failed to get container status \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": rpc error: code = NotFound desc = could not find container \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": container with ID starting with a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670356 4775 scope.go:117] "RemoveContainer" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670651 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": container with ID starting with af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6 not found: ID does not exist" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670677 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"} err="failed to get container status \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": rpc error: code = NotFound desc = could not find container \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": container with ID starting with af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670692 4775 scope.go:117] "RemoveContainer" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": container with ID starting with d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05 not found: ID does not exist" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670993 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"} err="failed to get container status \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": rpc error: code = NotFound desc = could not find container \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": container with ID starting with d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.671010 4775 scope.go:117] "RemoveContainer" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.686305 4775 scope.go:117] "RemoveContainer" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.713919 4775 scope.go:117] "RemoveContainer" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.721041 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.721276 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" gracePeriod=5
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.726804 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.752540 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" path="/var/lib/kubelet/pods/2b487540-88bb-496a-9aff-3f383cdc858b/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.753412 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" path="/var/lib/kubelet/pods/3ae6a7af-e7d7-440b-b7cb-366edba2d44e/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.754131 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" path="/var/lib/kubelet/pods/5415a9cc-8755-41e6-bd7b-1542339cadc6/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.755221 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68158dce-8840-47f8-8dac-37abc28edc74" path="/var/lib/kubelet/pods/68158dce-8840-47f8-8dac-37abc28edc74/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.755658 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" path="/var/lib/kubelet/pods/f1ecb76d-1e7c-4889-ab6d-451e8b534308/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.794406 4775 scope.go:117] "RemoveContainer" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.795369 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": container with ID starting with 05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27 not found: ID does not exist" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.795424 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"} err="failed to get container status \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": rpc error: code = NotFound desc = could not find container \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": container with ID starting with 05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.795486 4775 scope.go:117] "RemoveContainer" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.796044 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": container with ID starting with 5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79 not found: ID does not exist" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796080 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"} err="failed to get container status \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": rpc error: code = NotFound desc = could not find container \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": container with ID starting with 5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796107 4775 scope.go:117] "RemoveContainer" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.796528 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": container with ID starting with d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196 not found: ID does not exist" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796554 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"} err="failed to get container status \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": rpc error: code = NotFound desc = could not find container \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": container with ID starting with d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796570 4775 scope.go:117] "RemoveContainer" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.811871 4775 scope.go:117] "RemoveContainer" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.825731 4775 scope.go:117] "RemoveContainer" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838207 4775 scope.go:117] "RemoveContainer" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.838745 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": container with ID starting with 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 not found: ID does not exist" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838826 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"} err="failed to get container status \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": rpc error: code = NotFound desc = could not find container \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": container with ID starting with 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838873 4775 scope.go:117] "RemoveContainer" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.839283 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": container with ID starting with 0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045 not found: ID does not exist" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.839364 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"} err="failed to get container status \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": rpc error: code = NotFound desc = could not find container \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": container with ID starting with 0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.839420 4775 scope.go:117] "RemoveContainer" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.840050 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": container with ID starting with 575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1 not found: ID does not exist" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.840088 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"} err="failed to get container status \"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": rpc error: code = NotFound desc = could not find container \"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": container with ID starting with 575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.896912 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.924190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.996548 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.116309 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.128930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.185165 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.233674 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.249952 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.286344 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.301658 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.362079 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.423059 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493608 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/0.log"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493682 4775 generic.go:334] "Generic (PLEG): container finished" podID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d" exitCode=1
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerDied","Data":"6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d"}
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.497315 4775 scope.go:117] "RemoveContainer" containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.735070 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.809417 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.848483 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.879362 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.003227 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.150868 4775
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.200971 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.272831 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.281948 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.421123 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.455919 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.482924 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.528142 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.528950 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/0.log" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529022 4775 generic.go:334] "Generic (PLEG): container finished" podID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" exitCode=1 Jan 
27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529060 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerDied","Data":"f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320"} Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529106 4775 scope.go:117] "RemoveContainer" containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.532997 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:07 crc kubenswrapper[4775]: E0127 11:25:07.534823 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.578111 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.661549 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.817976 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.819193 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.838794 4775 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.878155 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.958054 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.108961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.119805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.341675 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.427790 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.536143 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.536618 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:08 crc kubenswrapper[4775]: E0127 11:25:08.536852 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" 
pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.749843 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.792888 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.845095 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.397085 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.399798 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.667625 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.991959 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.838292 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.838588 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891485 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891520 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891538 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891631 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891834 4775 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891852 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891866 4775 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891876 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.900804 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.993191 4775 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551894 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551950 4775 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" exitCode=137 Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551997 4775 scope.go:117] "RemoveContainer" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.552016 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.568377 4775 scope.go:117] "RemoveContainer" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: E0127 11:25:11.568770 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": container with ID starting with eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d not found: ID does not exist" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.568916 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d"} err="failed to get container status \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": rpc error: code = NotFound desc = could not find container \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": container with ID starting with eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d not found: ID does not exist" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.749894 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.750121 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.760421 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 
11:25:11.760465 4775 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3f309b33-e54b-48eb-a407-d8ac97d77f99" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.763419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.763475 4775 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3f309b33-e54b-48eb-a407-d8ac97d77f99" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.719199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.720841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.721505 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:14 crc kubenswrapper[4775]: E0127 11:25:14.721801 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:15 crc kubenswrapper[4775]: I0127 11:25:15.575251 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:15 crc kubenswrapper[4775]: E0127 11:25:15.576096 4775 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:21 crc kubenswrapper[4775]: I0127 11:25:21.485069 4775 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 11:25:24 crc kubenswrapper[4775]: I0127 11:25:24.171944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 11:25:25 crc kubenswrapper[4775]: I0127 11:25:25.094938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 11:25:27 crc kubenswrapper[4775]: I0127 11:25:27.096591 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 11:25:28 crc kubenswrapper[4775]: I0127 11:25:28.745313 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.088154 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.646765 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.646815 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" 
event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"a34f4e43fcb3601e0ea180c4cd3adefda3908a7e136a44c587c94bd71f0e2c86"} Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.647291 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.651284 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:33 crc kubenswrapper[4775]: I0127 11:25:33.936892 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 11:25:38 crc kubenswrapper[4775]: I0127 11:25:38.979359 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 11:25:59 crc kubenswrapper[4775]: I0127 11:25:59.518181 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:25:59 crc kubenswrapper[4775]: I0127 11:25:59.518808 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:26:01 crc kubenswrapper[4775]: I0127 11:26:01.860559 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:26:01 crc kubenswrapper[4775]: I0127 11:26:01.861027 4775 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" containerID="cri-o://c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" gracePeriod=30 Jan 27 11:26:01 crc kubenswrapper[4775]: I0127 11:26:01.954517 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:26:01 crc kubenswrapper[4775]: I0127 11:26:01.954809 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" containerID="cri-o://fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" gracePeriod=30 Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.219008 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.278174 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353684 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353743 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353807 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353876 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.354662 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.354767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config" (OuterVolumeSpecName: "config") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.355079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.355134 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.359178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.359966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn" (OuterVolumeSpecName: "kube-api-access-ljpnn") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "kube-api-access-ljpnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") "
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") "
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") "
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") "
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456642 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456655 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456664 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456674 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.457165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.457201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config" (OuterVolumeSpecName: "config") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.459485 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k" (OuterVolumeSpecName: "kube-api-access-gxf7k") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "kube-api-access-gxf7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.461236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557609 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557646 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557660 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557669 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646273 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"]
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646484 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646497 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646507 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646512 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646521 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646527 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646537 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646542 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646549 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646554 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646564 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646570 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646579 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646585 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646594 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646599 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646609 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646614 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646623 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646628 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646635 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646641 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-utilities"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646651 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646660 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646669 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646675 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646683 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646691 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646700 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646708 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-content"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646718 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646726 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646824 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646839 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646851 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646861 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646874 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646882 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646894 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646904 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.647415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.660908 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"]
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759674 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.760335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.835964 4775 generic.go:334] "Generic (PLEG): container finished" podID="e1b6882d-984d-432b-b3df-101a6437371b" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" exitCode=0
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836041 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerDied","Data":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"}
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerDied","Data":"f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e"}
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836092 4775 scope.go:117] "RemoveContainer" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836199 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839242 4775 generic.go:334] "Generic (PLEG): container finished" podID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" exitCode=0
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerDied","Data":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"}
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerDied","Data":"ee61d306bb5f6310bfe18fb9eb63cdf67c00e9b26b5cdca100d7222a8e1ec7f1"}
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839395 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.855903 4775 scope.go:117] "RemoveContainer" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.856379 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": container with ID starting with c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e not found: ID does not exist" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.856433 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"} err="failed to get container status \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": rpc error: code = NotFound desc = could not find container \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": container with ID starting with c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e not found: ID does not exist"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.856479 4775 scope.go:117] "RemoveContainer" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.861980 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862140 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.863668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.863877 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.864934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.867695 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.872357 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"]
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.876148 4775 scope.go:117] "RemoveContainer" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"
Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.880890 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": container with ID starting with fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6 not found: ID does not exist" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.880984 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"} err="failed to get container status \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": rpc error: code = NotFound desc = could not find container \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": container with ID starting with fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6 not found: ID does not exist"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.883226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.889591 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"]
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.902461 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"]
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.907306 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"]
Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.967342 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.140022 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"]
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.602022 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"]
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.602988 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605548 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605804 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605855 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.606267 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.607072 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.607114 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.616894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"]
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.752743 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" path="/var/lib/kubelet/pods/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb/volumes"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.754136 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b6882d-984d-432b-b3df-101a6437371b" path="/var/lib/kubelet/pods/e1b6882d-984d-432b-b3df-101a6437371b/volumes"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772563 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms6z\" (UniqueName: \"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" event={"ID":"7a1706f3-485f-4023-aee7-43602de1dafe","Type":"ContainerStarted","Data":"66694f9a395796a14832316007f29fd012a06140e29b909ee8f4aafeaf542760"}
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847596 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" event={"ID":"7a1706f3-485f-4023-aee7-43602de1dafe","Type":"ContainerStarted","Data":"fb130040fccc2dedda3fb909f2d67ff304e9ea9b06859e087a77f70e1d20fe9a"}
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847864 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.857388 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.865019 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" podStartSLOduration=1.864996868 podStartE2EDuration="1.864996868s" podCreationTimestamp="2026-01-27 11:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:03.864225426 +0000 UTC m=+343.005823233" watchObservedRunningTime="2026-01-27 11:26:03.864996868 +0000 UTC m=+343.006594645"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.873993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.874056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tms6z\" (UniqueName: \"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.874992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.875217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.875253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.877232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.883309 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.891894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms6z\" (UniqueName: \"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.953126 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.336196 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"]
Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.858667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" event={"ID":"2ee55e6e-e4e4-4af7-9585-f033b6db6467","Type":"ContainerStarted","Data":"535565bc215fc2416402a17e27a9411f7a70f23aa78bb5a5f6f062cc7581e0e3"}
Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.859085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" event={"ID":"2ee55e6e-e4e4-4af7-9585-f033b6db6467","Type":"ContainerStarted","Data":"e2332df7f9c512bfc49faf3e7a13558997430836ccac094d607d01ccdd0283cb"}
Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.881202 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" podStartSLOduration=3.8811743 podStartE2EDuration="3.8811743s" podCreationTimestamp="2026-01-27 11:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:04.880263246 +0000 UTC m=+344.021861023" watchObservedRunningTime="2026-01-27 11:26:04.8811743 +0000 UTC m=+344.022772097"
Jan 27 11:26:05 crc kubenswrapper[4775]: I0127 11:26:05.863957 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"
Jan 27 11:26:05 crc kubenswrapper[4775]: I0127 11:26:05.869779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.910975 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.913359 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.916673 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.920956 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955288 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: 
\"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.055993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.056300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.056536 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.057437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.057739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") 
" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.080147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.235088 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.503307 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.511077 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.519827 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.521819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694767 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694914 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.706483 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.795925 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.796488 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.796542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc 
kubenswrapper[4775]: I0127 11:26:21.797152 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.797342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.818070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.834735 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.843984 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950704 4775 generic.go:334] "Generic (PLEG): container finished" podID="30eb115d-82ef-4c37-8cf4-4f2945ad86c1" containerID="d421b49a5cb7396e4023242da08bce2682bc2eff68fa7ea941ad8a12eaa85899" exitCode=0 Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerDied","Data":"d421b49a5cb7396e4023242da08bce2682bc2eff68fa7ea941ad8a12eaa85899"} Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerStarted","Data":"833c5e147fd7fffc42baff4a223b6090d5f0d820cd757247d85e38a51b1ba790"} Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.242695 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.961953 4775 generic.go:334] "Generic (PLEG): container finished" podID="9db1a996-ad2f-460c-9d8d-cacc63c4924d" containerID="178a32703074e3fcf3b7fb9e371a5571616a222628b1e8b4d4d82ba09bb27c0b" exitCode=0 Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.961998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerDied","Data":"178a32703074e3fcf3b7fb9e371a5571616a222628b1e8b4d4d82ba09bb27c0b"} Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.962326 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" 
event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerStarted","Data":"8873508af937bdb58c2e53a1ae67ca35b860075c1bcfbb39a873af7354116971"} Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.300760 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.301689 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.304501 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.323034 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.418670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.418740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.419084 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" 
(UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.521091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " 
pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.545664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.618329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.910672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.912242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.914582 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.918359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.968739 4775 generic.go:334] "Generic (PLEG): container finished" podID="9db1a996-ad2f-460c-9d8d-cacc63c4924d" containerID="7cc05d44e18420d0e038bb2da39fac51d372552f66c7912faf9f6eafdeb37172" exitCode=0 Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.968810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerDied","Data":"7cc05d44e18420d0e038bb2da39fac51d372552f66c7912faf9f6eafdeb37172"} Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 
11:26:23.970495 4775 generic.go:334] "Generic (PLEG): container finished" podID="30eb115d-82ef-4c37-8cf4-4f2945ad86c1" containerID="76f3b2ac0d75f15ba59659ec2b0353e1c9e72bb8399df264a2d66cc5e85ed7f0" exitCode=0 Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.970532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerDied","Data":"76f3b2ac0d75f15ba59659ec2b0353e1c9e72bb8399df264a2d66cc5e85ed7f0"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030041 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.034557 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 
11:26:24.131472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131541 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.132213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.151387 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.240282 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.627366 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:24 crc kubenswrapper[4775]: W0127 11:26:24.632241 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55d8922_b4e4_4162_acbe_4294c4746204.slice/crio-a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4 WatchSource:0}: Error finding container a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4: Status 404 returned error can't find the container with id a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.976420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerStarted","Data":"d447e35293d37a3ba0d59c56f818a959d8bd43118847daa838d2007a0d225ec1"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978392 4775 generic.go:334] "Generic (PLEG): container finished" podID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerID="84c0689c056eae572aca7363d8bfb6f22824af6abde73b867420fd49d09493a1" exitCode=0 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" 
event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerDied","Data":"84c0689c056eae572aca7363d8bfb6f22824af6abde73b867420fd49d09493a1"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"89bcc0e9f79ab7eb0e16ab77a5809ef59247e944cba96054332e58ad5cb2f568"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.981378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerStarted","Data":"8a84ae7c295fccb6d2e8d3f57355dfb9ce7579c64e403d106e552945c44b76ec"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983080 4775 generic.go:334] "Generic (PLEG): container finished" podID="b55d8922-b4e4-4162-acbe-4294c4746204" containerID="1c3a3775845760503a5ad415560044cd02cceff592c3c5f46e29e42cc0b78917" exitCode=0 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerDied","Data":"1c3a3775845760503a5ad415560044cd02cceff592c3c5f46e29e42cc0b78917"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983126 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerStarted","Data":"a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4"} Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.000031 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-klf7d" podStartSLOduration=2.528553743 podStartE2EDuration="5.000015723s" podCreationTimestamp="2026-01-27 11:26:20 +0000 
UTC" firstStartedPulling="2026-01-27 11:26:21.953569779 +0000 UTC m=+361.095167546" lastFinishedPulling="2026-01-27 11:26:24.425031739 +0000 UTC m=+363.566629526" observedRunningTime="2026-01-27 11:26:24.997773432 +0000 UTC m=+364.139371229" watchObservedRunningTime="2026-01-27 11:26:25.000015723 +0000 UTC m=+364.141613510"
Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.037882 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xbvgj" podStartSLOduration=2.6093850549999997 podStartE2EDuration="4.03785964s" podCreationTimestamp="2026-01-27 11:26:21 +0000 UTC" firstStartedPulling="2026-01-27 11:26:22.963595043 +0000 UTC m=+362.105192830" lastFinishedPulling="2026-01-27 11:26:24.392069638 +0000 UTC m=+363.533667415" observedRunningTime="2026-01-27 11:26:25.036562294 +0000 UTC m=+364.178160071" watchObservedRunningTime="2026-01-27 11:26:25.03785964 +0000 UTC m=+364.179457417"
Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.990487 4775 generic.go:334] "Generic (PLEG): container finished" podID="b55d8922-b4e4-4162-acbe-4294c4746204" containerID="6b59a05568990621ed5774ab56c43518dc693b1ac996548749222d0c3b8c40c0" exitCode=0
Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.990580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerDied","Data":"6b59a05568990621ed5774ab56c43518dc693b1ac996548749222d0c3b8c40c0"}
Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.994529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9"}
Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.001585 4775 generic.go:334] "Generic (PLEG): container finished" podID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerID="a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9" exitCode=0
Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.001715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerDied","Data":"a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9"}
Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.004873 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerStarted","Data":"c9e399f257b5d94e8b7181d8eadc767becdc80cda89752762b5cb2993685e8cd"}
Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.047184 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5mgmj" podStartSLOduration=2.579689559 podStartE2EDuration="4.047156581s" podCreationTimestamp="2026-01-27 11:26:23 +0000 UTC" firstStartedPulling="2026-01-27 11:26:24.984053063 +0000 UTC m=+364.125650840" lastFinishedPulling="2026-01-27 11:26:26.451520085 +0000 UTC m=+365.593117862" observedRunningTime="2026-01-27 11:26:27.041413551 +0000 UTC m=+366.183011328" watchObservedRunningTime="2026-01-27 11:26:27.047156581 +0000 UTC m=+366.188754358"
Jan 27 11:26:28 crc kubenswrapper[4775]: I0127 11:26:28.010981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"2223891d8828dce1d4d66852bc57cdea0e57d36d0f846608cc0b5785742e81b7"}
Jan 27 11:26:28 crc kubenswrapper[4775]: I0127 11:26:28.036962 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-87qp8" podStartSLOduration=2.562585335 podStartE2EDuration="5.036919934s" podCreationTimestamp="2026-01-27 11:26:23 +0000 UTC" firstStartedPulling="2026-01-27 11:26:24.979728483 +0000 UTC m=+364.121326260" lastFinishedPulling="2026-01-27 11:26:27.454063082 +0000 UTC m=+366.595660859" observedRunningTime="2026-01-27 11:26:28.03133245 +0000 UTC m=+367.172930257" watchObservedRunningTime="2026-01-27 11:26:28.036919934 +0000 UTC m=+367.178517731"
Jan 27 11:26:29 crc kubenswrapper[4775]: I0127 11:26:29.518235 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:26:29 crc kubenswrapper[4775]: I0127 11:26:29.518621 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.236044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-klf7d"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.236490 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-klf7d"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.283464 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-klf7d"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.845840 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xbvgj"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.845901 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xbvgj"
Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.888254 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xbvgj"
Jan 27 11:26:32 crc kubenswrapper[4775]: I0127 11:26:32.071631 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xbvgj"
Jan 27 11:26:32 crc kubenswrapper[4775]: I0127 11:26:32.082878 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-klf7d"
Jan 27 11:26:33 crc kubenswrapper[4775]: I0127 11:26:33.619658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-87qp8"
Jan 27 11:26:33 crc kubenswrapper[4775]: I0127 11:26:33.620060 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-87qp8"
Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.240815 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5mgmj"
Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.241372 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5mgmj"
Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.285136 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5mgmj"
Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.687749 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-87qp8" podUID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerName="registry-server" probeResult="failure" output=<
Jan 27 11:26:34 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s
Jan 27 11:26:34 crc kubenswrapper[4775]: >
Jan 27 11:26:35 crc kubenswrapper[4775]: I0127 11:26:35.095274 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5mgmj"
Jan 27 11:26:41 crc kubenswrapper[4775]: I0127 11:26:41.941386 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"]
Jan 27 11:26:41 crc kubenswrapper[4775]: I0127 11:26:41.944257 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.025901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"]
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065355 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44wh\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.105524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44wh\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167378 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167474 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167575 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168331 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.173078 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.179526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.182164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44wh\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.182912 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.260625 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.700177 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"]
Jan 27 11:26:42 crc kubenswrapper[4775]: W0127 11:26:42.706766 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2a4899_c4ef_40a2_aeac_2596fdc1b282.slice/crio-8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883 WatchSource:0}: Error finding container 8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883: Status 404 returned error can't find the container with id 8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" event={"ID":"ee2a4899-c4ef-40a2-aeac-2596fdc1b282","Type":"ContainerStarted","Data":"09eae0822eb6c795fdc683be48e277bc5e9c6501fad209398cc833bc2e5da80a"}
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" event={"ID":"ee2a4899-c4ef-40a2-aeac-2596fdc1b282","Type":"ContainerStarted","Data":"8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883"}
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090628 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.110253 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" podStartSLOduration=2.110234508 podStartE2EDuration="2.110234508s" podCreationTimestamp="2026-01-27 11:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:43.107634556 +0000 UTC m=+382.249232333" watchObservedRunningTime="2026-01-27 11:26:43.110234508 +0000 UTC m=+382.251832285"
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.673557 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-87qp8"
Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.716070 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-87qp8"
Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518091 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518905 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518978 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.520179 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.520377 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" gracePeriod=600
Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195362 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" exitCode=0
Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"}
Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"}
Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195667 4775 scope.go:117] "RemoveContainer" containerID="e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"
Jan 27 11:27:02 crc kubenswrapper[4775]: I0127 11:27:02.273883 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp"
Jan 27 11:27:02 crc kubenswrapper[4775]: I0127 11:27:02.354042 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"]
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.422685 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" containerID="cri-o://666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" gracePeriod=30
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.864392 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.958956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959330 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959406 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959514 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959655 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.960156 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") "
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.960595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.961004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.964940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.965274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps" (OuterVolumeSpecName: "kube-api-access-629ps") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "kube-api-access-629ps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.966044 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.978038 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.983381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.992213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061205 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061238 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061249 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061258 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061267 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061275 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061286 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381440 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" exitCode=0
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381538 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerDied","Data":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"}
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerDied","Data":"8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477"}
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381610 4775 scope.go:117] "RemoveContainer" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381769 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.408099 4775 scope.go:117] "RemoveContainer" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"
Jan 27 11:27:28 crc kubenswrapper[4775]: E0127 11:27:28.408795 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": container with ID starting with 666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994 not found: ID does not exist" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.408974 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"} err="failed to get container status \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": rpc error: code = NotFound desc = could not find container \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": container with ID starting with 666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994 not found: ID does not exist"
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.440290 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"]
Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.448282 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"]
Jan 27 11:27:29 crc kubenswrapper[4775]: I0127 11:27:29.754639 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" path="/var/lib/kubelet/pods/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5/volumes"
Jan 27 11:28:59 crc kubenswrapper[4775]: I0127 11:28:59.517982 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:28:59 crc kubenswrapper[4775]: I0127 11:28:59.518674 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:29:29 crc kubenswrapper[4775]: I0127 11:29:29.518205 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:29:29 crc kubenswrapper[4775]: I0127 11:29:29.519023 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.517589 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.518297 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.518354 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.519349 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.519488 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" gracePeriod=600
Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.214729 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"]
Jan 27 11:30:00 crc kubenswrapper[4775]: E0127 11:30:00.214986 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry"
Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215001 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry"
Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215158
4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.218592 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.219502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.240523 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.332995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.333040 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.333073 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353482 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" exitCode=0 Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"} Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353570 4775 scope.go:117] "RemoveContainer" containerID="b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434642 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434723 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.437944 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.441098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.454557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.549608 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.782907 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 11:30:00 crc kubenswrapper[4775]: W0127 11:30:00.788893 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb6e2d5_5884_4a3b_84a1_88a5ee052da9.slice/crio-5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a WatchSource:0}: Error finding container 5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a: Status 404 returned error can't find the container with id 5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.363141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365574 4775 generic.go:334] "Generic (PLEG): container finished" podID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerID="0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1" exitCode=0 Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365632 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerDied","Data":"0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1"} Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" 
event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerStarted","Data":"5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a"} Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.658772 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758946 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.759435 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.765131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6" (OuterVolumeSpecName: "kube-api-access-zf2x6") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "kube-api-access-zf2x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.765571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861375 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861442 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861487 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379638 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" 
event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerDied","Data":"5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a"} Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379698 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a" Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379751 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.512533 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:24 crc kubenswrapper[4775]: E0127 11:30:24.513214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513231 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513356 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513828 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516160 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516240 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xnp7l" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.519331 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.520075 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.522667 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-99fqq" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.539672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.543416 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.555137 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9gbvg" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.561707 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.573861 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.578221 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mps57\" (UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod 
\"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.726817 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.726887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mps57\" (UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.726954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod \"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.746904 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.747219 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mps57\" 
(UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.749003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod \"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.831338 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.856714 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.871707 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.110074 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.127635 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882dbf86_77c4_46a5_a75b_b7b4a70d3ac1.slice/crio-8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1 WatchSource:0}: Error finding container 8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1: Status 404 returned error can't find the container with id 8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1 Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.134619 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.225711 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.228390 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b64e5cd_1b80_489b_8d69_3ebf7862eb9f.slice/crio-cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1 WatchSource:0}: Error finding container cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1: Status 404 returned error can't find the container with id cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1 Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.345985 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.352348 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea378b66_945f_4832_b293_59576474b63c.slice/crio-8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f WatchSource:0}: Error finding container 8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f: Status 404 returned error can't find the container with id 8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.506794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" event={"ID":"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1","Type":"ContainerStarted","Data":"8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1"} Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.507519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" event={"ID":"ea378b66-945f-4832-b293-59576474b63c","Type":"ContainerStarted","Data":"8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f"} Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.508869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xpr9c" event={"ID":"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f","Type":"ContainerStarted","Data":"cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.528995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xpr9c" event={"ID":"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f","Type":"ContainerStarted","Data":"c225bad7fda744ee0a152305e2c2dde6fc4462a36e0f2f46dda60d4df60d3c37"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.532023 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" 
event={"ID":"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1","Type":"ContainerStarted","Data":"1db14bf2755bae4b1b89b0075cbbaffd66899b5d89d451e3607c287c4f08e3ee"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.532150 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.544818 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-xpr9c" podStartSLOduration=1.499885586 podStartE2EDuration="4.544793059s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.231171593 +0000 UTC m=+604.372769370" lastFinishedPulling="2026-01-27 11:30:28.276079026 +0000 UTC m=+607.417676843" observedRunningTime="2026-01-27 11:30:28.541263577 +0000 UTC m=+607.682861354" watchObservedRunningTime="2026-01-27 11:30:28.544793059 +0000 UTC m=+607.686390846" Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.572342 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" podStartSLOduration=1.422662092 podStartE2EDuration="4.572322391s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.134394056 +0000 UTC m=+604.275991823" lastFinishedPulling="2026-01-27 11:30:28.284054305 +0000 UTC m=+607.425652122" observedRunningTime="2026-01-27 11:30:28.566050996 +0000 UTC m=+607.707648763" watchObservedRunningTime="2026-01-27 11:30:28.572322391 +0000 UTC m=+607.713920178" Jan 27 11:30:29 crc kubenswrapper[4775]: I0127 11:30:29.540663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" event={"ID":"ea378b66-945f-4832-b293-59576474b63c","Type":"ContainerStarted","Data":"c833c29eb74e627ad9fea30411217a37db682ae9fd5c2c2a4a2c0094511ed59b"} Jan 27 11:30:29 crc kubenswrapper[4775]: I0127 11:30:29.566495 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" podStartSLOduration=1.855251481 podStartE2EDuration="5.566438939s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.353713755 +0000 UTC m=+604.495311572" lastFinishedPulling="2026-01-27 11:30:29.064901253 +0000 UTC m=+608.206499030" observedRunningTime="2026-01-27 11:30:29.56042176 +0000 UTC m=+608.702019557" watchObservedRunningTime="2026-01-27 11:30:29.566438939 +0000 UTC m=+608.708036746" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.272415 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.273937 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" containerID="cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274138 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" containerID="cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274228 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" containerID="cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274566 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274609 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" containerID="cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274631 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" containerID="cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274649 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" containerID="cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.328847 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" containerID="cri-o://fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.574052 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.574743 
4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.577513 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578330 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578903 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578934 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" exitCode=143 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578944 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578954 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578963 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578971 4775 generic.go:334] 
"Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578970 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578980 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579102 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" exitCode=143 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579042 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579356 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" 
event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579419 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579431 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 
11:30:34.581324 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581866 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581909 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" exitCode=2 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.582621 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.582998 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.611121 4775 scope.go:117] "RemoveContainer" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.611999 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\": container with ID starting with 
aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c not found: ID does not exist" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.612853 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.617590 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.618255 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.631205 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673037 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqrg"] Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673327 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673354 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673386 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 
11:30:34.673398 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673412 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673440 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673524 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673659 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673680 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673693 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673711 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673722 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc 
kubenswrapper[4775]: E0127 11:30:34.673741 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673752 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673771 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kubecfg-setup" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673783 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kubecfg-setup" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673797 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673825 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673835 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673851 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674025 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674042 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674059 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674070 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674087 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674103 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674116 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674128 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674142 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674153 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc 
kubenswrapper[4775]: I0127 11:30:34.674171 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674184 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.674356 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674370 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.674383 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674394 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674582 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.676857 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755681 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755738 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755846 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755921 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755998 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756020 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756064 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756102 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" 
(UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756198 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash" (OuterVolumeSpecName: "host-slash") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756413 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756442 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756479 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756524 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756527 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756569 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756584 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756632 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket" (OuterVolumeSpecName: "log-socket") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756633 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756741 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log" (OuterVolumeSpecName: "node-log") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756787 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757127 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757268 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757307 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757525 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757783 4775 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757796 4775 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757807 4775 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757818 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757828 4775 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757837 4775 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757847 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757855 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757863 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757871 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757879 4775 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757887 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757897 4775 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758150 4775 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758159 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758167 4775 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758174 4775 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.761183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4" (OuterVolumeSpecName: "kube-api-access-czdm4") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "kube-api-access-czdm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.761260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.768011 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859934 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860120 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860125 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860193 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860221 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860250 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860881 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860894 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860907 4775 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861261 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.863974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.879021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.988798 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589537 4775 generic.go:334] "Generic (PLEG): container finished" podID="fae72616-e516-4ce6-86b8-b28f14a92939" containerID="d16ee06f4b6448af85e17a5a56c4db31922ee5f3324e04e6e548d036b2cbe3a3" exitCode=0
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerDied","Data":"d16ee06f4b6448af85e17a5a56c4db31922ee5f3324e04e6e548d036b2cbe3a3"}
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"09cfe9078423751d584eee942954fb930c2d766b5a623b8f7aba105e569f907d"}
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.593701 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log"
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.597208 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log"
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.597764 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg"
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.599247 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log"
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.718718 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"]
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.724639 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"]
Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.752467 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" path="/var/lib/kubelet/pods/7d657d41-09b6-43f2-babb-4cb13a62fd1f/volumes"
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"89f663a6b0297d93e84b18f60117d31df8f6eb622689e23db150f2704fe5cf7a"}
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"c461e08ca691633285c1043270a5de4e7e99dcb43dc1b92b3f2a4e5473ec2105"}
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610923 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"189d778b681818ad736a67d73c2ed45c58d46018ab1188ddd77713ee2c8206b0"}
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610932 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"e21d1c3529b62f8826c3112a7e93fc08adf3dfaa1a5f37c6371a0eaf0b4f5a62"}
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610941 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"ba6b34b749ae1647219001268d83b10d62ccde5e3f7e6b5f6ffddd83de26566a"}
Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"5be8408bc0028ddffd2077b92d63b4c221cfe55e52c960653307ec6b72b179bf"}
Jan 27 11:30:39 crc kubenswrapper[4775]: I0127 11:30:39.639641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"65098304a8a1a37d604c34217d9c290cfd3f2d1861a71136999dba5eb846c23b"}
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.658332 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"af117489bcb760cd6b4bade688f366a50c03ae76050e2dd6d0f1b2c7b9c2a49a"}
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659573 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659671 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.695950 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podStartSLOduration=7.695932513 podStartE2EDuration="7.695932513s" podCreationTimestamp="2026-01-27 11:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:30:41.691038584 +0000 UTC m=+620.832636451" watchObservedRunningTime="2026-01-27 11:30:41.695932513 +0000 UTC m=+620.837530300"
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.706771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.712740 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg"
Jan 27 11:30:48 crc kubenswrapper[4775]: I0127 11:30:48.745067 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"
Jan 27 11:30:48 crc kubenswrapper[4775]: E0127 11:30:48.745954 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e"
Jan 27 11:30:59 crc kubenswrapper[4775]: I0127 11:30:59.745737 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"
Jan 27 11:31:00 crc kubenswrapper[4775]: I0127 11:31:00.807779 4775 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log" Jan 27 11:31:00 crc kubenswrapper[4775]: I0127 11:31:00.808445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"35d4395d9d7eb335c205003f382ddb1acd0f675feb8bdae5008fcf9452419a97"} Jan 27 11:31:05 crc kubenswrapper[4775]: I0127 11:31:05.018859 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.423200 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.424675 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.427920 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.435490 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487798 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487877 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589610 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vfq\" (UniqueName: 
\"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.590490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.590505 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.614933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.742262 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.964543 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: W0127 11:31:14.971913 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252d02e0_ca7d_405f_8315_3588f55a7b0c.slice/crio-1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106 WatchSource:0}: Error finding container 1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106: Status 404 returned error can't find the container with id 1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106 Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917473 4775 generic.go:334] "Generic (PLEG): container finished" podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="50003896a68c4b615e5518043da0851529315fd0b9f601839474320686b53035" exitCode=0 Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917735 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"50003896a68c4b615e5518043da0851529315fd0b9f601839474320686b53035"} Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerStarted","Data":"1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106"} Jan 27 11:31:17 crc kubenswrapper[4775]: I0127 11:31:17.929937 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="0e989fb340becd147e428c9537634b8d3fbdf14a94f26bb24eb8c7e194acbb5c" exitCode=0 Jan 27 11:31:17 crc kubenswrapper[4775]: I0127 11:31:17.930005 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"0e989fb340becd147e428c9537634b8d3fbdf14a94f26bb24eb8c7e194acbb5c"} Jan 27 11:31:18 crc kubenswrapper[4775]: I0127 11:31:18.946117 4775 generic.go:334] "Generic (PLEG): container finished" podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="886e6e6985b6672998c5a3f60f6e73d0df864b637b88600dbb8d779ea1634165" exitCode=0 Jan 27 11:31:18 crc kubenswrapper[4775]: I0127 11:31:18.946178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"886e6e6985b6672998c5a3f60f6e73d0df864b637b88600dbb8d779ea1634165"} Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.194275 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265010 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265182 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265769 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle" (OuterVolumeSpecName: "bundle") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.271748 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq" (OuterVolumeSpecName: "kube-api-access-68vfq") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "kube-api-access-68vfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.280923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util" (OuterVolumeSpecName: "util") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367054 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367096 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367106 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106"} Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960498 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960418 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.962925 4775 scope.go:117] "RemoveContainer" containerID="2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.983311 4775 scope.go:117] "RemoveContainer" containerID="377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.997099 4775 scope.go:117] "RemoveContainer" containerID="22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.012866 4775 scope.go:117] "RemoveContainer" containerID="627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.028532 4775 scope.go:117] "RemoveContainer" containerID="491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.045963 4775 scope.go:117] "RemoveContainer" containerID="109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.061816 4775 scope.go:117] "RemoveContainer" containerID="fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.078829 4775 scope.go:117] "RemoveContainer" containerID="f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.097980 4775 scope.go:117] "RemoveContainer" containerID="46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.194708 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195384 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="pull" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195405 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="pull" Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195473 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="util" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195485 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="util" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195649 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.196186 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198224 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rtnxx" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198236 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198746 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.203575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.302155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: \"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.403499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: \"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.429515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: 
\"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.510064 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.764742 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.978520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" event={"ID":"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f","Type":"ContainerStarted","Data":"1cee26aecbd228ef746e224e9ea667077798e9b9561d4134fa1e12188cc3fc89"} Jan 27 11:31:27 crc kubenswrapper[4775]: I0127 11:31:27.007754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" event={"ID":"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f","Type":"ContainerStarted","Data":"4695d30628f6cd34b24844835fb98d2b4f85c2ef36ce5e32dc9807eb433f189c"} Jan 27 11:31:27 crc kubenswrapper[4775]: I0127 11:31:27.033091 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" podStartSLOduration=1.553827802 podStartE2EDuration="4.033050328s" podCreationTimestamp="2026-01-27 11:31:23 +0000 UTC" firstStartedPulling="2026-01-27 11:31:23.769095739 +0000 UTC m=+662.910693526" lastFinishedPulling="2026-01-27 11:31:26.248318275 +0000 UTC m=+665.389916052" observedRunningTime="2026-01-27 11:31:27.028908486 +0000 UTC m=+666.170506293" watchObservedRunningTime="2026-01-27 11:31:27.033050328 +0000 UTC m=+666.174648125" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.025784 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 
11:31:28.026799 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.032259 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-p9dzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.040489 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.041327 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.048486 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.048910 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.072335 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.101211 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4vtwf"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.101990 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.160895 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.161063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.161205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.177878 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.180248 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185882 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185911 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4kp2r" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185951 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.205848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" 
Jan 27 11:31:28 crc kubenswrapper[4775]: E0127 11:31:28.262773 4775 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262859 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: E0127 11:31:28.262953 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair podName:d9f9feec-ee04-44de-8879-4071243ac6db nodeName:}" failed. No retries permitted until 2026-01-27 11:31:28.762917511 +0000 UTC m=+667.904515308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-d9lzh" (UID: "d9f9feec-ee04-44de-8879-4071243ac6db") : secret "openshift-nmstate-webhook" not found
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.283201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.283526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.348360 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367320 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367500 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.368880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.369556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.370074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.370489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.375267 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.389281 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"]
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.390106 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.390300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.406756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"]
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.411113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.415751 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.447836 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa6cbcb_077f_4ae7_85b2_d79679ef64df.slice/crio-69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4 WatchSource:0}: Error finding container 69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4: Status 404 returned error can't find the container with id 69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468543 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468584 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.469015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.469051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.504001 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.570668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571125 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571208 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572553 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.576464 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.577383 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.589871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.675600 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"]
Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.676813 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d9c92d_c012_448b_8ff5_00f10c17c5a7.slice/crio-a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2 WatchSource:0}: Error finding container a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2: Status 404 returned error can't find the container with id a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.737993 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.764741 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"]
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.773259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.781761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.954188 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"]
Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.957734 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe7bd0c_4325_4a4a_a4e7_fb0d4c990bb7.slice/crio-e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699 WatchSource:0}: Error finding container e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699: Status 404 returned error can't find the container with id e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699
Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.964820 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.025648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c44595ff-qwwqd" event={"ID":"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7","Type":"ContainerStarted","Data":"e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699"}
Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.027000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"982053ce13c449a8452182194cbe607fec59cc559be6e06ac08826aa661adc64"}
Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.028372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" event={"ID":"76d9c92d-c012-448b-8ff5-00f10c17c5a7","Type":"ContainerStarted","Data":"a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2"}
Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.029779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4vtwf" event={"ID":"0aa6cbcb-077f-4ae7-85b2-d79679ef64df","Type":"ContainerStarted","Data":"69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4"}
Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.150127 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"]
Jan 27 11:31:29 crc kubenswrapper[4775]: W0127 11:31:29.164013 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f9feec_ee04_44de_8879_4071243ac6db.slice/crio-7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3 WatchSource:0}: Error finding container 7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3: Status 404 returned error can't find the container with id 7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3
Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.037761 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" event={"ID":"d9f9feec-ee04-44de-8879-4071243ac6db","Type":"ContainerStarted","Data":"7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3"}
Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.039303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c44595ff-qwwqd" event={"ID":"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7","Type":"ContainerStarted","Data":"c7c2c326647ef2564d34ef361d65662db23fcdeda4d850dd5a72b45fd4f7e386"}
Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.056699 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84c44595ff-qwwqd" podStartSLOduration=2.056681216 podStartE2EDuration="2.056681216s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:31:30.054897367 +0000 UTC m=+669.196495144" watchObservedRunningTime="2026-01-27 11:31:30.056681216 +0000 UTC m=+669.198278983"
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.051432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"c9aad9050ef97adc11acc62a5017d1bbad96d5367d230f0ce862f05c7fb52775"}
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.054115 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" event={"ID":"d9f9feec-ee04-44de-8879-4071243ac6db","Type":"ContainerStarted","Data":"ad20f0aecba19368384b3c78e7bc1a28d738c995c83b1c9f0fe81147c8d01b56"}
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.054426 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.056306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" event={"ID":"76d9c92d-c012-448b-8ff5-00f10c17c5a7","Type":"ContainerStarted","Data":"f7c4c875e34078949e7bb169f7b0ed5babe168d72546d17e1821c4a1958894fc"}
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.057903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4vtwf" event={"ID":"0aa6cbcb-077f-4ae7-85b2-d79679ef64df","Type":"ContainerStarted","Data":"a53f78ac6c47508c2dfea94e941a23e71f0ca9c499b2776a3f7d624db5b2737d"}
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.058047 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.074475 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" podStartSLOduration=2.089719766 podStartE2EDuration="4.074436708s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:29.167891359 +0000 UTC m=+668.309489136" lastFinishedPulling="2026-01-27 11:31:31.152608301 +0000 UTC m=+670.294206078" observedRunningTime="2026-01-27 11:31:32.069207966 +0000 UTC m=+671.210805763" watchObservedRunningTime="2026-01-27 11:31:32.074436708 +0000 UTC m=+671.216034495"
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.088993 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4vtwf" podStartSLOduration=1.405721123 podStartE2EDuration="4.088975464s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.45020746 +0000 UTC m=+667.591805237" lastFinishedPulling="2026-01-27 11:31:31.133461801 +0000 UTC m=+670.275059578" observedRunningTime="2026-01-27 11:31:32.088244094 +0000 UTC m=+671.229841911" watchObservedRunningTime="2026-01-27 11:31:32.088975464 +0000 UTC m=+671.230573261"
Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.119823 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" podStartSLOduration=1.662898175 podStartE2EDuration="4.119805773s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.678972898 +0000 UTC m=+667.820570675" lastFinishedPulling="2026-01-27 11:31:31.135880496 +0000 UTC m=+670.277478273" observedRunningTime="2026-01-27 11:31:32.116243067 +0000 UTC m=+671.257840854" watchObservedRunningTime="2026-01-27 11:31:32.119805773 +0000 UTC m=+671.261403550"
Jan 27 11:31:34 crc kubenswrapper[4775]: I0127 11:31:34.069800 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"d4823ca5709ba44722036bfeb628dd796de39c293301d0ccbfb15c732c75c316"}
Jan 27 11:31:34 crc kubenswrapper[4775]: I0127 11:31:34.092251 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" podStartSLOduration=1.331951986 podStartE2EDuration="6.092226272s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.771891618 +0000 UTC m=+667.913489395" lastFinishedPulling="2026-01-27 11:31:33.532165864 +0000 UTC m=+672.673763681" observedRunningTime="2026-01-27 11:31:34.082939189 +0000 UTC m=+673.224537006" watchObservedRunningTime="2026-01-27 11:31:34.092226272 +0000 UTC m=+673.233824079"
Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.438174 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-4vtwf"
Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.738239 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.738313 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.745866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:39 crc kubenswrapper[4775]: I0127 11:31:39.107752 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84c44595ff-qwwqd"
Jan 27 11:31:39 crc kubenswrapper[4775]: I0127 11:31:39.191004 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"]
Jan 27 11:31:48 crc kubenswrapper[4775]: I0127 11:31:48.973068 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.078865 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"]
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.081153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.083895 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.097281 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"]
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.281321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.281569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.306018 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.405078 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"
Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.689248 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"]
Jan 27 11:32:03 crc kubenswrapper[4775]: W0127 11:32:03.700920 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ed53a2_63f4_4636_b581_2a686d44d5d0.slice/crio-76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846 WatchSource:0}: Error finding container 76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846: Status 404 returned error can't find the container with id 76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.264313 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hj8rf" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" containerID="cri-o://94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" gracePeriod=15
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273527 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="b3c1106a62535249c344a2ba38dd2af2783f4df62b77cd7cef2cb50afd049328" exitCode=0
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"b3c1106a62535249c344a2ba38dd2af2783f4df62b77cd7cef2cb50afd049328"}
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerStarted","Data":"76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846"}
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.621388 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hj8rf_ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/console/0.log"
Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.621492 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697720 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697781 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698808 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config" (OuterVolumeSpecName: "console-config") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698827 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698913 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca" (OuterVolumeSpecName: "service-ca") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704180 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704495 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl" (OuterVolumeSpecName: "kube-api-access-qvbnl") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "kube-api-access-qvbnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799429 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799504 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799522 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799534 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799545 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799557 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799596 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:05 crc 
kubenswrapper[4775]: I0127 11:32:05.284883 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hj8rf_ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/console/0.log" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.285883 4775 generic.go:334] "Generic (PLEG): container finished" podID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" exitCode=2 Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.285992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerDied","Data":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerDied","Data":"152d04ae80ec3e4ea65562160c2d55c0e2688c495a74f3bc1b1fca916b3879fa"} Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286025 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286390 4775 scope.go:117] "RemoveContainer" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.314198 4775 scope.go:117] "RemoveContainer" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: E0127 11:32:05.314994 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": container with ID starting with 94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a not found: ID does not exist" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.315109 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} err="failed to get container status \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": rpc error: code = NotFound desc = could not find container \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": container with ID starting with 94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a not found: ID does not exist" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.332572 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.336503 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.750840 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" 
path="/var/lib/kubelet/pods/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/volumes" Jan 27 11:32:06 crc kubenswrapper[4775]: I0127 11:32:06.294260 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="f69691451111e14034d367b659566622a72253576f1d06403550dc4371afa6fa" exitCode=0 Jan 27 11:32:06 crc kubenswrapper[4775]: I0127 11:32:06.294305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"f69691451111e14034d367b659566622a72253576f1d06403550dc4371afa6fa"} Jan 27 11:32:07 crc kubenswrapper[4775]: I0127 11:32:07.317801 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="339bbc62a1aae1c202c0dc66cd40de2f500ad662feae5351cd8fac675c93837e" exitCode=0 Jan 27 11:32:07 crc kubenswrapper[4775]: I0127 11:32:07.317899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"339bbc62a1aae1c202c0dc66cd40de2f500ad662feae5351cd8fac675c93837e"} Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.626215 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.760504 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle" (OuterVolumeSpecName: "bundle") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.764931 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw" (OuterVolumeSpecName: "kube-api-access-cgrtw") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "kube-api-access-cgrtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.772033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util" (OuterVolumeSpecName: "util") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860333 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860386 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860446 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335583 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846"} Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335916 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846" Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335933 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.150685 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"] Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151501 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151516 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151525 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="pull" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151532 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="pull" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151546 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151554 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151568 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="util" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151576 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="util" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151687 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" 
containerName="console" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151700 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.152153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155511 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155782 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fptrf" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155915 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155973 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.166386 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.223009 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " 
pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285421 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386699 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.392280 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.392349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.406490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.469579 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.490615 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.491473 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.494754 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.494897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.495176 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q86kf" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.517634 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.624630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.624998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod 
\"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.625172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726964 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726987 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.735550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.756262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.758323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.780808 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"]
Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.846023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.078666 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"]
Jan 27 11:32:19 crc kubenswrapper[4775]: W0127 11:32:19.087411 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb19b04_4cd3_4304_a572_d25d4aa2932b.slice/crio-34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2 WatchSource:0}: Error finding container 34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2: Status 404 returned error can't find the container with id 34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2
Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.388895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" event={"ID":"acb19b04-4cd3-4304-a572-d25d4aa2932b","Type":"ContainerStarted","Data":"34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2"}
Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.389881 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" event={"ID":"7560029a-575e-4d87-b4e8-4f090c5a7cd9","Type":"ContainerStarted","Data":"72f6506066b09282f04c67a68caf2788d1fc79b1526106321ff8d1003ba93f77"}
Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.435662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" event={"ID":"7560029a-575e-4d87-b4e8-4f090c5a7cd9","Type":"ContainerStarted","Data":"adad9e29af6591f758ea555c02105819fdbdf11cb0d16c7b3575edbf92d4167f"}
Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.437162 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"
Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.445153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" event={"ID":"acb19b04-4cd3-4304-a572-d25d4aa2932b","Type":"ContainerStarted","Data":"85eae189b415d15a57f4e923c259603f2faba9a718a1f3dc47c94c2d133dd878"}
Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.445908 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.466720 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" podStartSLOduration=1.86886559 podStartE2EDuration="6.466697738s" podCreationTimestamp="2026-01-27 11:32:18 +0000 UTC" firstStartedPulling="2026-01-27 11:32:18.789994365 +0000 UTC m=+717.931592142" lastFinishedPulling="2026-01-27 11:32:23.387826513 +0000 UTC m=+722.529424290" observedRunningTime="2026-01-27 11:32:24.461756393 +0000 UTC m=+723.603354170" watchObservedRunningTime="2026-01-27 11:32:24.466697738 +0000 UTC m=+723.608295515"
Jan 27 11:32:29 crc kubenswrapper[4775]: I0127 11:32:29.518260 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:32:29 crc kubenswrapper[4775]: I0127 11:32:29.518604 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:32:38 crc kubenswrapper[4775]: I0127 11:32:38.876274 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"
Jan 27 11:32:38 crc kubenswrapper[4775]: I0127 11:32:38.899374 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" podStartSLOduration=16.527767836 podStartE2EDuration="20.899352722s" podCreationTimestamp="2026-01-27 11:32:18 +0000 UTC" firstStartedPulling="2026-01-27 11:32:19.091303192 +0000 UTC m=+718.232900969" lastFinishedPulling="2026-01-27 11:32:23.462888088 +0000 UTC m=+722.604485855" observedRunningTime="2026-01-27 11:32:24.479373783 +0000 UTC m=+723.620971560" watchObservedRunningTime="2026-01-27 11:32:38.899352722 +0000 UTC m=+738.040950499"
Jan 27 11:32:58 crc kubenswrapper[4775]: I0127 11:32:58.472290 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.214953 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.215940 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.219373 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tbqt8"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.227817 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-52txr"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.230027 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.230061 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.233924 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.234137 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.234614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.286582 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qm9dq"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.287403 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289066 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wdvrg"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289588 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289727 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289731 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.308030 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.309043 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.310396 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.312700 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351603 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351738 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351937 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352077 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453104 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453162 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453180 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453199 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453218 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453266 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.453492 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.453551 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist podName:5573a041-6f7e-4c23-b2ea-42de01c96cdd nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.953527194 +0000 UTC m=+759.095124971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist") pod "speaker-qm9dq" (UID: "5573a041-6f7e-4c23-b2ea-42de01c96cdd") : secret "metallb-memberlist" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.454753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454781 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454851 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert podName:de8a1d9c-9c8b-4200-92ae-b82c65b24d56 nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.95483045 +0000 UTC m=+759.096428227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert") pod "frr-k8s-webhook-server-7df86c4f6c-ht6jz" (UID: "de8a1d9c-9c8b-4200-92ae-b82c65b24d56") : secret "frr-k8s-webhook-server-cert" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454870 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454913 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs podName:ac3b8043-04c7-4036-9dc5-6068d914356c nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.954901822 +0000 UTC m=+759.096499599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs") pod "frr-k8s-52txr" (UID: "ac3b8043-04c7-4036-9dc5-6068d914356c") : secret "frr-k8s-certs-secret" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.460023 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.472069 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.476862 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.479797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.518054 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.518101 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554641 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.557521 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.559194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.568414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.594666 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.625733 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.815697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"]
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.960522 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.960646 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist podName:5573a041-6f7e-4c23-b2ea-42de01c96cdd nodeName:}" failed. No retries permitted until 2026-01-27 11:33:00.960605855 +0000 UTC m=+760.102203632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist") pod "speaker-qm9dq" (UID: "5573a041-6f7e-4c23-b2ea-42de01c96cdd") : secret "metallb-memberlist" not found
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.965064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr"
Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.967368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.145998 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.154069 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-52txr"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.366831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"]
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.391558 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.647463 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" event={"ID":"de8a1d9c-9c8b-4200-92ae-b82c65b24d56","Type":"ContainerStarted","Data":"3835776c46ca061097373c41c47490955b2540c241016c1909a6ce5df5616661"}
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.648347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"7d56a7db9b8bdfab77cc2b371062405f4969a8cdfdb20859159f209b38363b5c"}
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"b3e240f6e2869ea9daf2581758b9fbce1caac2aa2a41f5fa2f06964c4278406b"}
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649942 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"23dbcd1d23d30655f6b5395a6550e1b378527870653d8bd7e9162404a9c0b28d"}
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649953 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"ee3e6815c2b9377345d90b66b56c0f99e0acd07813c70a75afafefa19d248586"}
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.650069 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-4tjsf"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.674291 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-4tjsf" podStartSLOduration=1.6742714429999999 podStartE2EDuration="1.674271443s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:33:00.669933175 +0000 UTC m=+759.811530952" watchObservedRunningTime="2026-01-27 11:33:00.674271443 +0000 UTC m=+759.815869220"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.972339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.986248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq"
Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.101582 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qm9dq"
Jan 27 11:33:01 crc kubenswrapper[4775]: W0127 11:33:01.157049 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5573a041_6f7e_4c23_b2ea_42de01c96cdd.slice/crio-90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567 WatchSource:0}: Error finding container 90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567: Status 404 returned error can't find the container with id 90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567
Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.659157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"3c9e971f1c524bbd200da1957ca5f480fa8d28a840f5e1dcf956e8b53e340463"}
Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.659474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567"}
Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.677151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"0c371d696aeaa53d0296327cd7c9b2c25fe3b3f085e206b979eb205dbf6d192e"}
Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.677293 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qm9dq"
Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.698109 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qm9dq" podStartSLOduration=3.698088243 podStartE2EDuration="3.698088243s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:33:02.696361267 +0000 UTC m=+761.837959054" watchObservedRunningTime="2026-01-27 11:33:02.698088243 +0000 UTC m=+761.839686020" Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.720219 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" event={"ID":"de8a1d9c-9c8b-4200-92ae-b82c65b24d56","Type":"ContainerStarted","Data":"c170a171d6cf9bc9c8588394f975859d58cad8b1bf82fd83c02fad87aed36ace"} Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.720878 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.722229 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="64be078422524f86eb16ffe658bb47bef69ebfd330b5afa90579cb1d29df9506" exitCode=0 Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.722294 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"64be078422524f86eb16ffe658bb47bef69ebfd330b5afa90579cb1d29df9506"} Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.745868 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" podStartSLOduration=1.720765268 podStartE2EDuration="8.745849696s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="2026-01-27 11:33:00.374178959 +0000 UTC m=+759.515776736" lastFinishedPulling="2026-01-27 11:33:07.399263367 +0000 UTC m=+766.540861164" observedRunningTime="2026-01-27 11:33:07.744917171 +0000 UTC m=+766.886514968" watchObservedRunningTime="2026-01-27 11:33:07.745849696 +0000 UTC m=+766.887447473" Jan 27 
11:33:08 crc kubenswrapper[4775]: I0127 11:33:08.731731 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="99670cfcfebfec066858b73a8de9b5d78b915baeb44fecb97e2434c4b60cf99f" exitCode=0 Jan 27 11:33:08 crc kubenswrapper[4775]: I0127 11:33:08.731795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"99670cfcfebfec066858b73a8de9b5d78b915baeb44fecb97e2434c4b60cf99f"} Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.630237 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.740678 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="d40b5fef1225b0a1cebb8e29b3c22bf788afccb31b8eae58ec7b61a1aa769377" exitCode=0 Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.740741 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"d40b5fef1225b0a1cebb8e29b3c22bf788afccb31b8eae58ec7b61a1aa769377"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.749929 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"9b1bb7b622b0338596a54aa140d2e3bdaec2cd508f17f9a25c3c35c890201291"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750276 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" 
event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"129f9cb7db24ef4a502b80e9fe53a32bc180e3ee8cce85a61d023252411b02e2"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"ab4938dd6124bfe745b22865b226f2933fcab80254ba4d55a4e5374c49b45c26"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"ab4f112b75b075a1e0f42c35fd3faf0cbac80210ac42b5d70fe68cdef4bb8f06"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"84fa7fb410daaa73523d8f0ceaa0a5a5b1a69aca31b1824f97cceb86ddbd2a94"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"e54609e76012448cc9b14400734594a7b3db675e82a2974443932e69d30c8599"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.779637 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-52txr" podStartSLOduration=4.657806273 podStartE2EDuration="11.779621715s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="2026-01-27 11:33:00.296680199 +0000 UTC m=+759.438277966" lastFinishedPulling="2026-01-27 11:33:07.418495631 +0000 UTC m=+766.560093408" observedRunningTime="2026-01-27 11:33:10.774934248 +0000 UTC m=+769.916532035" watchObservedRunningTime="2026-01-27 11:33:10.779621715 +0000 UTC m=+769.921219492" Jan 27 
11:33:11 crc kubenswrapper[4775]: I0127 11:33:11.106019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qm9dq" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.861686 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.862885 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865424 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9zhfn" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865656 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.877280 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.949245 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.050719 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod 
\"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.067582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.210799 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.639734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:14 crc kubenswrapper[4775]: W0127 11:33:14.648975 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024a719c_d757_41d5_b790_fc3c75d0b4ee.slice/crio-ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3 WatchSource:0}: Error finding container ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3: Status 404 returned error can't find the container with id ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3 Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.790757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerStarted","Data":"ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3"} Jan 27 11:33:15 crc kubenswrapper[4775]: I0127 11:33:15.154742 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:15 crc 
kubenswrapper[4775]: I0127 11:33:15.193293 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.242492 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.815434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerStarted","Data":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.837422 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wlhh5" podStartSLOduration=2.794700637 podStartE2EDuration="4.837394363s" podCreationTimestamp="2026-01-27 11:33:13 +0000 UTC" firstStartedPulling="2026-01-27 11:33:14.651033938 +0000 UTC m=+773.792631755" lastFinishedPulling="2026-01-27 11:33:16.693727704 +0000 UTC m=+775.835325481" observedRunningTime="2026-01-27 11:33:17.832742217 +0000 UTC m=+776.974340014" watchObservedRunningTime="2026-01-27 11:33:17.837394363 +0000 UTC m=+776.978992140" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.854216 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.855198 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.862605 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.899987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.002352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.036716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.186917 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.493938 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:18 crc kubenswrapper[4775]: W0127 11:33:18.500805 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b44f0b_813c_4626_a8ec_54ac78bbb086.slice/crio-0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec WatchSource:0}: Error finding container 0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec: Status 404 returned error can't find the container with id 0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.822599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swjcb" event={"ID":"56b44f0b-813c-4626-a8ec-54ac78bbb086","Type":"ContainerStarted","Data":"0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec"} Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.822771 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wlhh5" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" containerID="cri-o://d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" gracePeriod=2 Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.285194 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.424934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"024a719c-d757-41d5-b790-fc3c75d0b4ee\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.434079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f" (OuterVolumeSpecName: "kube-api-access-ftr7f") pod "024a719c-d757-41d5-b790-fc3c75d0b4ee" (UID: "024a719c-d757-41d5-b790-fc3c75d0b4ee"). InnerVolumeSpecName "kube-api-access-ftr7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.527171 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.832712 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swjcb" event={"ID":"56b44f0b-813c-4626-a8ec-54ac78bbb086","Type":"ContainerStarted","Data":"78412061b1d15aeddf791256458359d5cc017abc1289bc61fa8ae1a5e63d4ab4"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834841 4775 generic.go:334] "Generic (PLEG): container finished" podID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" exitCode=0 Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" 
event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerDied","Data":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerDied","Data":"ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834913 4775 scope.go:117] "RemoveContainer" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.835010 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.877599 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-swjcb" podStartSLOduration=2.496677256 podStartE2EDuration="2.877572251s" podCreationTimestamp="2026-01-27 11:33:17 +0000 UTC" firstStartedPulling="2026-01-27 11:33:18.505964112 +0000 UTC m=+777.647561889" lastFinishedPulling="2026-01-27 11:33:18.886859057 +0000 UTC m=+778.028456884" observedRunningTime="2026-01-27 11:33:19.855558001 +0000 UTC m=+778.997155808" watchObservedRunningTime="2026-01-27 11:33:19.877572251 +0000 UTC m=+779.019170068" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.882182 4775 scope.go:117] "RemoveContainer" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: E0127 11:33:19.882919 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": container with ID starting with 
d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95 not found: ID does not exist" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.882993 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} err="failed to get container status \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": rpc error: code = NotFound desc = could not find container \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": container with ID starting with d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95 not found: ID does not exist" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.883758 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.893853 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:20 crc kubenswrapper[4775]: I0127 11:33:20.153091 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:20 crc kubenswrapper[4775]: I0127 11:33:20.157619 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:21 crc kubenswrapper[4775]: I0127 11:33:21.760808 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" path="/var/lib/kubelet/pods/024a719c-d757-41d5-b790-fc3c75d0b4ee/volumes" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.187506 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.188145 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.234580 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.934294 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.517880 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.518752 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.518797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.519291 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.519343 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad" gracePeriod=600 Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.909208 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad" exitCode=0 Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.909400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.910129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"} Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.910231 4775 scope.go:117] "RemoveContainer" containerID="b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.340049 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:36 crc kubenswrapper[4775]: E0127 11:33:36.340902 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.340919 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.341066 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.341965 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.345779 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gn5z5" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.366104 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582703 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod 
\"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.608735 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.670446 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.137260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.986967 4775 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="36c33e59210f8597590e58ad1640b9d860f65abc02a3ec940c173013312f6d4e" exitCode=0 Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.987080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"36c33e59210f8597590e58ad1640b9d860f65abc02a3ec940c173013312f6d4e"} Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.987333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerStarted","Data":"20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0"} Jan 27 11:33:38 crc kubenswrapper[4775]: I0127 11:33:38.994255 4775 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="29316e19d76aade267caba578e797b4bd9ecc8d4d9e7f4f92d321dc5e0a535e5" exitCode=0 Jan 27 11:33:38 crc kubenswrapper[4775]: I0127 11:33:38.994492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"29316e19d76aade267caba578e797b4bd9ecc8d4d9e7f4f92d321dc5e0a535e5"} Jan 27 11:33:40 crc kubenswrapper[4775]: I0127 11:33:40.001990 4775 generic.go:334] 
"Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="3b517171e2f225019f41f9077a13977c7d266c910909e4f7a2dd8f129053e996" exitCode=0 Jan 27 11:33:40 crc kubenswrapper[4775]: I0127 11:33:40.002041 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"3b517171e2f225019f41f9077a13977c7d266c910909e4f7a2dd8f129053e996"} Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.313595 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450654 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450747 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.451531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle" (OuterVolumeSpecName: "bundle") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.457026 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv" (OuterVolumeSpecName: "kube-api-access-n75pv") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "kube-api-access-n75pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.465114 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util" (OuterVolumeSpecName: "util") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.567955 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.568025 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.568043 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.018960 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0"} Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.019388 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0" Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.019343 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.991233 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992047 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="util" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992063 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="util" Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992078 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="pull" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992085 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="pull" Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992104 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992114 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992246 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992748 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.994810 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-qhtfs" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.020837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.088025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.188851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.215357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.327698 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.863099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:50 crc kubenswrapper[4775]: I0127 11:33:50.072180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" event={"ID":"8868fb89-f25b-48ef-b4e2-9acab9f78790","Type":"ContainerStarted","Data":"fccdbb5cfc07cddd2b172c42a0dbc3411689c4628eae3d3bb86468e0faed9304"} Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.106514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" event={"ID":"8868fb89-f25b-48ef-b4e2-9acab9f78790","Type":"ContainerStarted","Data":"9902d0c5c5d528c83a34f60d79d942798887e3d652681eafbc131a6c7ceeb030"} Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.107280 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.151938 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" podStartSLOduration=2.908720325 podStartE2EDuration="7.151915575s" podCreationTimestamp="2026-01-27 11:33:48 +0000 UTC" firstStartedPulling="2026-01-27 11:33:49.880148091 +0000 UTC m=+809.021745868" lastFinishedPulling="2026-01-27 11:33:54.123343341 +0000 UTC m=+813.264941118" observedRunningTime="2026-01-27 11:33:55.135371545 +0000 UTC m=+814.276969322" watchObservedRunningTime="2026-01-27 11:33:55.151915575 +0000 UTC m=+814.293513352" Jan 27 11:33:59 crc kubenswrapper[4775]: I0127 11:33:59.330617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.003456 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.004545 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.007115 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qdzw9" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.014302 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.015286 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.016963 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vn2x4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.018122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.041891 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.046352 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.047069 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.061529 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bpwv8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.097349 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.098075 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.102729 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hr54n" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.103526 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.104715 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.106786 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dzxkr" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.121853 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.146780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524xg\" (UniqueName: 
\"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150216 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.157328 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.166032 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.167048 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.185959 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-j58ml" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.222975 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.223948 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.230571 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-slgvx" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.246608 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.247424 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.248926 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.249330 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5sbqs" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251799 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251983 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.252022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-524xg\" (UniqueName: \"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.252050 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.256160 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.277773 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.279098 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.280971 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5dfsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.282067 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.284075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.293763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.305155 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.335079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-524xg\" (UniqueName: \"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.345755 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358831 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358891 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358929 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.359002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.397833 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.398261 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.398606 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.401139 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.411589 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.412744 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.418222 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-69nmj" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.423716 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.434128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.445343 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.446135 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.448520 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6d7xh" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.452062 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.455904 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.466071 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.466885 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467010 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467088 4775 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467152 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.467513 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.467567 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:37.967550461 +0000 UTC m=+857.109148238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.473498 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.477146 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c9b2k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.481122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.484617 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.485352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.485712 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.487146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.500768 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.501652 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.503216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mghw4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.506089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.514868 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.525097 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.525198 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.526075 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.527303 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f4vht" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.529901 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.530669 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.535393 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2xknr" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.539275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.541978 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.542831 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.545363 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.546333 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cfsqd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.546544 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.550368 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.551514 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-m7x54" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.555967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.561551 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.563719 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.570274 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rhhx8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2v5t\" (UniqueName: \"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod \"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xts7d\" (UniqueName: 
\"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.574262 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.574517 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.583424 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.587730 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.588884 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.590881 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hwctc" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.593141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.602077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.615590 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.616664 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.620329 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r6fhd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.621214 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.646979 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.647900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.652845 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dxp8c" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.676867 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678160 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj29\" (UniqueName: \"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2v5t\" (UniqueName: 
\"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: \"7df5397d-0c1f-46b4-8695-d80c752ca569\") " pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod 
\"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xts7d\" (UniqueName: \"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.731611 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.742048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xts7d\" (UniqueName: \"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.748956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod \"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.756690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2v5t\" (UniqueName: \"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.772900 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780246 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 
11:34:37.780322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: \"7df5397d-0c1f-46b4-8695-d80c752ca569\") " pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcj29\" (UniqueName: 
\"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.780737 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.780806 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.280787765 +0000 UTC m=+857.422385542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.791811 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.799503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.801013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.809285 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.811557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.815906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcj29\" (UniqueName: \"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.817530 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818227 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818241 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818597 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: \"7df5397d-0c1f-46b4-8695-d80c752ca569\") " 
pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818931 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818971 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.819041 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822744 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-29t4c" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822769 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-slbwg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.824599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 
11:34:37.829632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.830816 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.851865 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.852003 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " 
pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881653 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881775 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.904956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 
11:34:37.926029 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.939917 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.979703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990164 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.991306 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.991360 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.491345226 +0000 UTC m=+857.632943003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992737 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992788 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.492772855 +0000 UTC m=+857.634370632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992830 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992851 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.992844557 +0000 UTC m=+858.134442334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.011329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.023356 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.192285 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.237664 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.240275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.263052 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.271179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.294147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.294723 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.294774 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.294760261 +0000 UTC m=+858.436358038 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.475637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" event={"ID":"f04fa2a0-7af2-439a-9169-6edf5be65b35","Type":"ContainerStarted","Data":"635cb70a23c2860502f533646fdb4d841561609d7d6d12757d03aea64d1f5c15"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.480543 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" event={"ID":"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e","Type":"ContainerStarted","Data":"60e06bdfb4344d6653703814829342f16064f4e9f6ccc2ae239efee14eed5d21"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.496348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.496935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" event={"ID":"0cabb338-c4a1-41b4-abd6-d535b0e88406","Type":"ContainerStarted","Data":"9ff280c57a2ad9bfdcbb64fbf25906d01b4b4260b123d21e25aba910387259b1"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.503181 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " 
pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.503245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503428 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503503 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.503484862 +0000 UTC m=+858.645082639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503925 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503962 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.503950145 +0000 UTC m=+858.645547922 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.545367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" event={"ID":"04cbcc0c-4375-44f0-9461-b43492e9d95b","Type":"ContainerStarted","Data":"31114467e544a77d7142ca6664ad0354ae2be9de3f038de88002af551675944c"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.590939 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.593338 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703a739a_6687_4324_b937_7d0efe7c143b.slice/crio-73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4 WatchSource:0}: Error finding container 73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4: Status 404 returned error can't find the container with id 73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.717879 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.723035 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb296a3cd_1dc1_4511_af7a_7b1801e23e61.slice/crio-db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737 WatchSource:0}: Error finding container db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737: 
Status 404 returned error can't find the container with id db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.745042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.770100 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.775847 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.781620 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e719fbd_ac18_4ae1_bac6_c42f1e081daa.slice/crio-b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1 WatchSource:0}: Error finding container b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1: Status 404 returned error can't find the container with id b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.882318 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.927610 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.931420 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701902fe_7e51_44b6_923b_0a60c96d6707.slice/crio-cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3 WatchSource:0}: Error finding container 
cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3: Status 404 returned error can't find the container with id cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.935977 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.940535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.945488 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.952320 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrhjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-569695f6c5-pmk9t_openstack-operators(6bcdd59a-9739-40e7-9625-3e56009dcbd7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.953605 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:39 crc 
kubenswrapper[4775]: I0127 11:34:39.008713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.008895 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.008943 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.008930206 +0000 UTC m=+860.150527983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.083177 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.095369 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.113311 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.115475 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v2v5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74ffd97575-cln8g_openstack-operators(2a55fa83-c395-4ac2-bc2e-355ad48a4a95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.115750 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knm5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-g5nsq_openstack-operators(a5e8d398-7976-4603-8409-304fa193f7f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.116930 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.117181 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.117360 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfq7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-7748d79f84-vmtx4_openstack-operators(e14198f0-3413-4350-bae5-33b23ceead05): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.119567 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.123658 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.140118 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zwgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6c866cfdcb-2mz97_openstack-operators(5070c545-d4c0-46b3-afb9-c130dc982406): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.142067 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.143065 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.371771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 
11:34:39.371903 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.371945 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.371932719 +0000 UTC m=+860.513530496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.565764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" event={"ID":"bea84175-0947-45e5-a635-b7d32a0442c6","Type":"ContainerStarted","Data":"de0d07df41b70f712f4495a5c9d10b333e9dfe524846a62c156a15afac6ffbf2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.570906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" event={"ID":"6bcdd59a-9739-40e7-9625-3e56009dcbd7","Type":"ContainerStarted","Data":"7e2a71620a96de3c0d78581b7109e8a2dd456a79ea833aa315e2fa5f03de02b1"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.575846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" 
podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579670 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" event={"ID":"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17","Type":"ContainerStarted","Data":"6b0c0397fd7e8f15f43be57a5446f9a1ee7dd2e74b91a1285468ea9820c3f6d3"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579972 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580157 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580279 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. 
No retries permitted until 2026-01-27 11:34:41.58027714 +0000 UTC m=+860.721874917 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580394 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.580387852 +0000 UTC m=+860.721985629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.581161 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" event={"ID":"7df5397d-0c1f-46b4-8695-d80c752ca569","Type":"ContainerStarted","Data":"3f9cab5092ec3e0828421e5d32caa592c31724a0584923795e7daff3e6b69c0a"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.583172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" event={"ID":"701902fe-7e51-44b6-923b-0a60c96d6707","Type":"ContainerStarted","Data":"cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.585193 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" 
event={"ID":"5070c545-d4c0-46b3-afb9-c130dc982406","Type":"ContainerStarted","Data":"05cb3cba9a63ad4223e70482b58fa699ab2d97e4e9f695bd16501c8a0c6e52e5"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.587357 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6\\\"\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.589089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" event={"ID":"01a03f23-ead5-4a15-976f-4dda2622083b","Type":"ContainerStarted","Data":"a12385f6b6a14bb0efff0b292cd456d54d060b0815340f2f6c58a2d14f2fb4c2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.590862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" event={"ID":"a5e8d398-7976-4603-8409-304fa193f7f1","Type":"ContainerStarted","Data":"a127703edaf726d8d0a6759706c1413eadce88c4fadbc649f9181ce194d0df53"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.593751 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" event={"ID":"56fb2890-7d29-452c-9f24-4aa20d977f0b","Type":"ContainerStarted","Data":"2a8c518ad4c1694cfba4686074c46253f254793efa03210cd6c86e60c6b2b54a"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.595085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" 
event={"ID":"2a55fa83-c395-4ac2-bc2e-355ad48a4a95","Type":"ContainerStarted","Data":"2df9ecf40ff4e7c5039b686e527d410aafce1e43f24d8d96f693a67d3f8b67f7"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.595553 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.597438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" event={"ID":"4e719fbd-ac18-4ae1-bac6-c42f1e081daa","Type":"ContainerStarted","Data":"b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.597497 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.600427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" event={"ID":"703a739a-6687-4324-b937-7d0efe7c143b","Type":"ContainerStarted","Data":"73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.602651 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" 
event={"ID":"dd9264fb-034f-46d3-8698-dcc6fc3470f6","Type":"ContainerStarted","Data":"0b2ec739a4678906269d7e56c4e3d973c36d50c3b9858ac7441988e3f88587d2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.614138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" event={"ID":"b296a3cd-1dc1-4511-af7a-7b1801e23e61","Type":"ContainerStarted","Data":"db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.622758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" event={"ID":"e14198f0-3413-4350-bae5-33b23ceead05","Type":"ContainerStarted","Data":"2b30435a88fba4d79f128cebe3008c8324ac25f329488b697a57319c3079f50a"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.624595 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.626602 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" event={"ID":"909c9a87-2eb1-4a52-b86d-6d36524b1eb2","Type":"ContainerStarted","Data":"4c14ed35bd38ffe59dd1cbc70bb1c4834fc14d1087a80d81e0117f38a3c9eef3"} Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.647202 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648093 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648855 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648924 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.649361 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6\\\"\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.100462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.100653 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.100733 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.100714373 +0000 UTC m=+864.242312150 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.405222 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.405360 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.405462 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.405432184 +0000 UTC m=+864.547029961 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.607960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.608013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608247 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608335 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.608316775 +0000 UTC m=+864.749914552 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608691 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608730 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.608721896 +0000 UTC m=+864.750319663 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.154269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.154428 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.154995 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert 
podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.154956783 +0000 UTC m=+872.296554600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.459952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.460091 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.460211 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.460197819 +0000 UTC m=+872.601795596 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.662093 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.662273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662291 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662414 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662480 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.662462324 +0000 UTC m=+872.804060101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662970 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.662958667 +0000 UTC m=+872.804556444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.735441 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9" Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.736481 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nr5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-76d4d5b8f9-dvj9s_openstack-operators(c31d5b06-1ad2-4914-96c1-e0f0b8c4974e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.737712 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" podUID="c31d5b06-1ad2-4914-96c1-e0f0b8c4974e" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.181666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.181812 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.181862 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.181848317 +0000 UTC m=+888.323446094 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.367740 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.367904 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcj29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7db57dc8bf-5lbbt_openstack-operators(01a03f23-ead5-4a15-976f-4dda2622083b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.369089 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podUID="01a03f23-ead5-4a15-976f-4dda2622083b" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.485275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.485443 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.485500 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.48548748 +0000 UTC m=+888.627085257 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.688322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.688723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688410 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688917 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.688898425 +0000 UTC m=+888.830496202 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688863 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.689078 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.689067599 +0000 UTC m=+888.830665376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.733746 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podUID="01a03f23-ead5-4a15-976f-4dda2622083b" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.733846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" podUID="c31d5b06-1ad2-4914-96c1-e0f0b8c4974e" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.040338 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.040537 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xv9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6476466c7c-lb4h8_openstack-operators(bea84175-0947-45e5-a635-b7d32a0442c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.042333 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podUID="bea84175-0947-45e5-a635-b7d32a0442c6" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.586267 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.586473 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lj5jn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-p9vts_openstack-operators(701902fe-7e51-44b6-923b-0a60c96d6707): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.587673 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podUID="701902fe-7e51-44b6-923b-0a60c96d6707" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.739563 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podUID="bea84175-0947-45e5-a635-b7d32a0442c6" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.740168 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podUID="701902fe-7e51-44b6-923b-0a60c96d6707" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.193099 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.193283 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dr8zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7bf4858b78-fcd9x_openstack-operators(7df5397d-0c1f-46b4-8695-d80c752ca569): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.194538 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podUID="7df5397d-0c1f-46b4-8695-d80c752ca569" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.637900 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.638063 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gjwcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-65596dbf77-9sfp8_openstack-operators(909c9a87-2eb1-4a52-b86d-6d36524b1eb2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.639252 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podUID="909c9a87-2eb1-4a52-b86d-6d36524b1eb2" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.745151 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29\\\"\"" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podUID="909c9a87-2eb1-4a52-b86d-6d36524b1eb2" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.745219 4775 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podUID="7df5397d-0c1f-46b4-8695-d80c752ca569" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.181964 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.182472 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k5djl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-78f8b7b89c-2wqgg_openstack-operators(4e719fbd-ac18-4ae1-bac6-c42f1e081daa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.183808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podUID="4e719fbd-ac18-4ae1-bac6-c42f1e081daa" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.751240 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podUID="4e719fbd-ac18-4ae1-bac6-c42f1e081daa" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.774006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" event={"ID":"dd9264fb-034f-46d3-8698-dcc6fc3470f6","Type":"ContainerStarted","Data":"548593cc29a876f9222f1632cc88a4434c35b1898d79274985cccf44b91c7fe6"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.774633 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.775242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" event={"ID":"0cabb338-c4a1-41b4-abd6-d535b0e88406","Type":"ContainerStarted","Data":"b00b005894f23ad2c865391e2949c5760bc0be62a93c590e72b544cb0fe412cf"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.775376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.776772 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" event={"ID":"2a55fa83-c395-4ac2-bc2e-355ad48a4a95","Type":"ContainerStarted","Data":"e13af6ea8377cebc5d591f7e6fbe9c7804b94d0e73b83409bf1812ec8449b5b0"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.776909 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.778536 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" event={"ID":"6bcdd59a-9739-40e7-9625-3e56009dcbd7","Type":"ContainerStarted","Data":"94b0270e1c432c8f8499a7de0d29a99be4a4a802962461ad697581cdeee69138"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.778716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.780124 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" event={"ID":"04cbcc0c-4375-44f0-9461-b43492e9d95b","Type":"ContainerStarted","Data":"7dd9730a8cbf8f49919b51773eaec381e7159052fa47e3be7cc62ecc6c641081"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.780246 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.781466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" event={"ID":"e14198f0-3413-4350-bae5-33b23ceead05","Type":"ContainerStarted","Data":"0e67793155549f91af8a8c7045fdc4a09671f113a8b99df1e493fd31d9c32325"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.781595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.782668 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" event={"ID":"f04fa2a0-7af2-439a-9169-6edf5be65b35","Type":"ContainerStarted","Data":"7528a4bdfcd62ca8291923b4dc41b3a10a858b5e7b24dae765425ae4d120c882"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 
11:35:00.782792 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.784057 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" event={"ID":"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17","Type":"ContainerStarted","Data":"e7e9dd09cbb8f84479bc5dfa684ca55126249450ab3f0b79dc594b1a37c92c49"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.784104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.785274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" event={"ID":"703a739a-6687-4324-b937-7d0efe7c143b","Type":"ContainerStarted","Data":"2846fde5c67d3282993a541e7b8228d5d80f8035ece155d120aa7b92c788f5d7"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.785376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.786442 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" event={"ID":"56fb2890-7d29-452c-9f24-4aa20d977f0b","Type":"ContainerStarted","Data":"0babafb3000afa84eab9b817fec9baae925504384ba783e0602178d7974538d5"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.786552 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.787736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" event={"ID":"5070c545-d4c0-46b3-afb9-c130dc982406","Type":"ContainerStarted","Data":"061d59c933ee41017ec8a52ab83bc5aa3518bc1c0de172d74a18d409a54ec297"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.787846 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.789006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" event={"ID":"b296a3cd-1dc1-4511-af7a-7b1801e23e61","Type":"ContainerStarted","Data":"3adb6b812f62ead77633d9d24d3690032be20991d85b12e9a8e6cdfffa621a10"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.789072 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.790502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" event={"ID":"a5e8d398-7976-4603-8409-304fa193f7f1","Type":"ContainerStarted","Data":"a4d4be1769df7e26ecb3ccbd864663e043aa4f6bb7a5dc038b9f62290c7f1e85"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.799028 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" podStartSLOduration=5.186058543 podStartE2EDuration="23.799013902s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.545830289 +0000 UTC m=+857.687428066" lastFinishedPulling="2026-01-27 11:34:57.158785648 +0000 UTC m=+876.300383425" observedRunningTime="2026-01-27 11:35:00.795761914 +0000 UTC m=+879.937359691" watchObservedRunningTime="2026-01-27 11:35:00.799013902 +0000 UTC m=+879.940611679" Jan 
27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.827665 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" podStartSLOduration=5.740964698 podStartE2EDuration="24.827642755s" podCreationTimestamp="2026-01-27 11:34:36 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.072313257 +0000 UTC m=+857.213911034" lastFinishedPulling="2026-01-27 11:34:57.158991324 +0000 UTC m=+876.300589091" observedRunningTime="2026-01-27 11:35:00.822539225 +0000 UTC m=+879.964137032" watchObservedRunningTime="2026-01-27 11:35:00.827642755 +0000 UTC m=+879.969240532" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.847815 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" podStartSLOduration=5.038539904 podStartE2EDuration="23.847795315s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.349153577 +0000 UTC m=+857.490751354" lastFinishedPulling="2026-01-27 11:34:57.158408988 +0000 UTC m=+876.300006765" observedRunningTime="2026-01-27 11:35:00.843946249 +0000 UTC m=+879.985544026" watchObservedRunningTime="2026-01-27 11:35:00.847795315 +0000 UTC m=+879.989393092" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.864720 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podStartSLOduration=3.151021835 podStartE2EDuration="23.864704726s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.115532837 +0000 UTC m=+858.257130614" lastFinishedPulling="2026-01-27 11:34:59.829215688 +0000 UTC m=+878.970813505" observedRunningTime="2026-01-27 11:35:00.863655958 +0000 UTC m=+880.005253735" watchObservedRunningTime="2026-01-27 11:35:00.864704726 +0000 UTC m=+880.006302503" Jan 27 11:35:00 crc 
kubenswrapper[4775]: I0127 11:35:00.881536 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podStartSLOduration=3.038736259 podStartE2EDuration="23.881522806s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.952202596 +0000 UTC m=+858.093800363" lastFinishedPulling="2026-01-27 11:34:59.794989113 +0000 UTC m=+878.936586910" observedRunningTime="2026-01-27 11:35:00.87728024 +0000 UTC m=+880.018878027" watchObservedRunningTime="2026-01-27 11:35:00.881522806 +0000 UTC m=+880.023120583" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.895759 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" podStartSLOduration=6.565429795 podStartE2EDuration="24.895744764s" podCreationTimestamp="2026-01-27 11:34:36 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.324549386 +0000 UTC m=+857.466147163" lastFinishedPulling="2026-01-27 11:34:56.654864355 +0000 UTC m=+875.796462132" observedRunningTime="2026-01-27 11:35:00.894225913 +0000 UTC m=+880.035823690" watchObservedRunningTime="2026-01-27 11:35:00.895744764 +0000 UTC m=+880.037342541" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.918783 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" podStartSLOduration=5.486235222 podStartE2EDuration="23.918751193s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.726211885 +0000 UTC m=+857.867809662" lastFinishedPulling="2026-01-27 11:34:57.158727856 +0000 UTC m=+876.300325633" observedRunningTime="2026-01-27 11:35:00.915933166 +0000 UTC m=+880.057530953" watchObservedRunningTime="2026-01-27 11:35:00.918751193 +0000 UTC m=+880.060348960" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 
11:35:00.938131 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" podStartSLOduration=4.020582015 podStartE2EDuration="23.938111012s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.751892216 +0000 UTC m=+857.893489993" lastFinishedPulling="2026-01-27 11:34:58.669421173 +0000 UTC m=+877.811018990" observedRunningTime="2026-01-27 11:35:00.937863724 +0000 UTC m=+880.079461511" watchObservedRunningTime="2026-01-27 11:35:00.938111012 +0000 UTC m=+880.079708789" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.957741 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podStartSLOduration=3.341607861 podStartE2EDuration="23.957717337s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.117039848 +0000 UTC m=+858.258637625" lastFinishedPulling="2026-01-27 11:34:59.733149284 +0000 UTC m=+878.874747101" observedRunningTime="2026-01-27 11:35:00.954623402 +0000 UTC m=+880.096221179" watchObservedRunningTime="2026-01-27 11:35:00.957717337 +0000 UTC m=+880.099315114" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.994345 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podStartSLOduration=3.341953041 podStartE2EDuration="23.994327397s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.115041863 +0000 UTC m=+858.256639640" lastFinishedPulling="2026-01-27 11:34:59.767416179 +0000 UTC m=+878.909013996" observedRunningTime="2026-01-27 11:35:00.979888512 +0000 UTC m=+880.121486289" watchObservedRunningTime="2026-01-27 11:35:00.994327397 +0000 UTC m=+880.135925174" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.996101 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podStartSLOduration=3.366653525 podStartE2EDuration="23.996092465s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.139923773 +0000 UTC m=+858.281521550" lastFinishedPulling="2026-01-27 11:34:59.769362673 +0000 UTC m=+878.910960490" observedRunningTime="2026-01-27 11:35:00.992817886 +0000 UTC m=+880.134415673" watchObservedRunningTime="2026-01-27 11:35:00.996092465 +0000 UTC m=+880.137690242" Jan 27 11:35:01 crc kubenswrapper[4775]: I0127 11:35:01.011152 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" podStartSLOduration=6.13756365 podStartE2EDuration="24.011135686s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.78131635 +0000 UTC m=+857.922914127" lastFinishedPulling="2026-01-27 11:34:56.654888386 +0000 UTC m=+875.796486163" observedRunningTime="2026-01-27 11:35:01.007591959 +0000 UTC m=+880.149189736" watchObservedRunningTime="2026-01-27 11:35:01.011135686 +0000 UTC m=+880.152733463" Jan 27 11:35:01 crc kubenswrapper[4775]: I0127 11:35:01.030241 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" podStartSLOduration=4.463669585 podStartE2EDuration="24.030221707s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.595015452 +0000 UTC m=+857.736613229" lastFinishedPulling="2026-01-27 11:34:58.161567574 +0000 UTC m=+877.303165351" observedRunningTime="2026-01-27 11:35:01.025864818 +0000 UTC m=+880.167462595" watchObservedRunningTime="2026-01-27 11:35:01.030221707 +0000 UTC m=+880.171819484" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.828963 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" event={"ID":"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e","Type":"ContainerStarted","Data":"967e70286a376c5cff07b0e24d07f343dd672d3fde789d839386b64ef515fb7f"} Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.829850 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.831317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" event={"ID":"01a03f23-ead5-4a15-976f-4dda2622083b","Type":"ContainerStarted","Data":"243a0b9063d35bc448b8e7118de9b12c1c5c081d0843da8c2d20773d44ae4d89"} Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.831603 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.841339 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" podStartSLOduration=1.897954586 podStartE2EDuration="29.841324048s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.349445156 +0000 UTC m=+857.491042933" lastFinishedPulling="2026-01-27 11:35:06.292814618 +0000 UTC m=+885.434412395" observedRunningTime="2026-01-27 11:35:06.841231346 +0000 UTC m=+885.982829163" watchObservedRunningTime="2026-01-27 11:35:06.841324048 +0000 UTC m=+885.982921825" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.872690 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podStartSLOduration=2.683120188 podStartE2EDuration="29.872667264s" 
podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.950626604 +0000 UTC m=+858.092224381" lastFinishedPulling="2026-01-27 11:35:06.14017368 +0000 UTC m=+885.281771457" observedRunningTime="2026-01-27 11:35:06.862561798 +0000 UTC m=+886.004159565" watchObservedRunningTime="2026-01-27 11:35:06.872667264 +0000 UTC m=+886.014265041" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.403038 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.427266 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.470387 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.497327 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.529066 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.577092 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.736195 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.775081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.795378 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.854754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.855629 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.985356 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.866055 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" event={"ID":"4e719fbd-ac18-4ae1-bac6-c42f1e081daa","Type":"ContainerStarted","Data":"772ec2f550b879ad4921a83a00a00bb60ff43cf4936267420136d3c27fcc9202"} Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.866668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.894184 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podStartSLOduration=2.440005498 podStartE2EDuration="31.894165951s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.783868559 +0000 UTC m=+857.925466336" lastFinishedPulling="2026-01-27 11:35:08.238029022 +0000 UTC 
m=+887.379626789" observedRunningTime="2026-01-27 11:35:08.88166509 +0000 UTC m=+888.023262907" watchObservedRunningTime="2026-01-27 11:35:08.894165951 +0000 UTC m=+888.035763728" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.217662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.225256 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.446442 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.521507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.532145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.724294 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.724584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.732365 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.732945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.760822 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.883905 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.017483 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.024191 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:35:10 crc kubenswrapper[4775]: W0127 11:35:10.033652 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e47cb1c_7f01_4b8d_904f_fed543678a02.slice/crio-74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b WatchSource:0}: Error finding container 74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b: Status 404 returned error can't find the container with id 74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.224953 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:35:10 crc kubenswrapper[4775]: W0127 11:35:10.234631 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ecfe007_a4bf_4c31_bc83_36f4c5f00815.slice/crio-788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf WatchSource:0}: Error finding container 788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf: Status 404 returned error can't find the container with id 788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.892246 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" event={"ID":"bea84175-0947-45e5-a635-b7d32a0442c6","Type":"ContainerStarted","Data":"29ad418aeb425efe4b3c4f68a14c6fcf53bcf8fa12afa24c93e327cb1555d94e"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 
11:35:10.892492 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.895603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" event={"ID":"3e47cb1c-7f01-4b8d-904f-fed543678a02","Type":"ContainerStarted","Data":"74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.926737 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podStartSLOduration=2.706405844 podStartE2EDuration="33.926721691s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.128522802 +0000 UTC m=+858.270120579" lastFinishedPulling="2026-01-27 11:35:10.348838649 +0000 UTC m=+889.490436426" observedRunningTime="2026-01-27 11:35:10.923873533 +0000 UTC m=+890.065471310" watchObservedRunningTime="2026-01-27 11:35:10.926721691 +0000 UTC m=+890.068319468" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.931699 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" event={"ID":"909c9a87-2eb1-4a52-b86d-6d36524b1eb2","Type":"ContainerStarted","Data":"8cf2c4e631781a2b74c75ba33fe1e4854d82020b9e1395e4d25b4627fff1b91b"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.932630 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.958391 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podStartSLOduration=2.502451154 
podStartE2EDuration="33.958375485s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.89556646 +0000 UTC m=+858.037164237" lastFinishedPulling="2026-01-27 11:35:10.351490771 +0000 UTC m=+889.493088568" observedRunningTime="2026-01-27 11:35:10.95711812 +0000 UTC m=+890.098715897" watchObservedRunningTime="2026-01-27 11:35:10.958375485 +0000 UTC m=+890.099973262" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.960633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" event={"ID":"2ecfe007-a4bf-4c31-bc83-36f4c5f00815","Type":"ContainerStarted","Data":"5162c94e4d7380d09ba424c0ef305ff23a4a24e167a2a891157a7753e88d2421"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.960666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" event={"ID":"2ecfe007-a4bf-4c31-bc83-36f4c5f00815","Type":"ContainerStarted","Data":"788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.961416 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.973479 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" event={"ID":"7df5397d-0c1f-46b4-8695-d80c752ca569","Type":"ContainerStarted","Data":"85b6560e297a54acf133a7e81757a18f504d9a16cff2cdaa7bb153806099d042"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.974132 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.980122 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" event={"ID":"701902fe-7e51-44b6-923b-0a60c96d6707","Type":"ContainerStarted","Data":"e6fc78a1889c65e6fd8d1767fd2c7f07389fd3e7b8e07261244947755adca48e"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.980651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.984998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" event={"ID":"0da235e3-e76a-408f-8e0e-3cdd7ce76705","Type":"ContainerStarted","Data":"747ec97e4507215c75ea2ae6bd21b95c09cf3cdea75c831ed344535fae7d07a9"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.998857 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" podStartSLOduration=33.99884029 podStartE2EDuration="33.99884029s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:35:10.99519671 +0000 UTC m=+890.136794487" watchObservedRunningTime="2026-01-27 11:35:10.99884029 +0000 UTC m=+890.140438067" Jan 27 11:35:11 crc kubenswrapper[4775]: I0127 11:35:11.064372 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podStartSLOduration=2.773367003 podStartE2EDuration="34.064357639s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.933764413 +0000 UTC m=+858.075362190" lastFinishedPulling="2026-01-27 11:35:10.224755059 +0000 UTC m=+889.366352826" observedRunningTime="2026-01-27 11:35:11.032641653 +0000 UTC m=+890.174239430" watchObservedRunningTime="2026-01-27 
11:35:11.064357639 +0000 UTC m=+890.205955416" Jan 27 11:35:11 crc kubenswrapper[4775]: I0127 11:35:11.769274 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podStartSLOduration=3.440730008 podStartE2EDuration="34.769237619s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.950613013 +0000 UTC m=+858.092210790" lastFinishedPulling="2026-01-27 11:35:10.279120604 +0000 UTC m=+889.420718401" observedRunningTime="2026-01-27 11:35:11.073712245 +0000 UTC m=+890.215310022" watchObservedRunningTime="2026-01-27 11:35:11.769237619 +0000 UTC m=+890.910835416" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.009569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" event={"ID":"0da235e3-e76a-408f-8e0e-3cdd7ce76705","Type":"ContainerStarted","Data":"5dbb2248bb9af2ed3e66020381d43a322be0784e17193bbc6fc65a4e3101b6aa"} Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.009863 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.012129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" event={"ID":"3e47cb1c-7f01-4b8d-904f-fed543678a02","Type":"ContainerStarted","Data":"ef164437dd6589a87691e5731309ec83dd85a93f5d9cf05212a2c7423583c4b4"} Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.012223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.029311 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" podStartSLOduration=33.124172622 podStartE2EDuration="36.029296301s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:35:09.897954485 +0000 UTC m=+889.039552262" lastFinishedPulling="2026-01-27 11:35:12.803078164 +0000 UTC m=+891.944675941" observedRunningTime="2026-01-27 11:35:13.026911177 +0000 UTC m=+892.168508954" watchObservedRunningTime="2026-01-27 11:35:13.029296301 +0000 UTC m=+892.170894078" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.059585 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" podStartSLOduration=33.287000699000004 podStartE2EDuration="36.059564648s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:35:10.035694006 +0000 UTC m=+889.177291783" lastFinishedPulling="2026-01-27 11:35:12.808257955 +0000 UTC m=+891.949855732" observedRunningTime="2026-01-27 11:35:13.053512953 +0000 UTC m=+892.195110730" watchObservedRunningTime="2026-01-27 11:35:13.059564648 +0000 UTC m=+892.201162425" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.458438 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.680791 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.813961 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.838769 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.929619 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.942786 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:35:18 crc kubenswrapper[4775]: I0127 11:35:18.197270 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:35:19 crc kubenswrapper[4775]: I0127 11:35:19.459595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:19 crc kubenswrapper[4775]: I0127 11:35:19.773749 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:20 crc kubenswrapper[4775]: I0127 11:35:20.022700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:29 crc kubenswrapper[4775]: I0127 11:35:29.517443 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:35:29 crc kubenswrapper[4775]: I0127 11:35:29.517996 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.401608 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.403104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.404902 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405047 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zhqjq" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405245 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405463 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.447614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.489004 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.491252 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.493733 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.496867 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.522231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.522301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.623635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.623687 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 
11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624175 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.625937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.648689 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 
11:35:35.724256 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725669 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.726761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.726812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " 
pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.741592 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.814311 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.194013 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.194589 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.261405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:36 crc kubenswrapper[4775]: W0127 11:35:36.262602 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b04272_f555_4b48_8702_64db912ff8e8.slice/crio-d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1 WatchSource:0}: Error finding container d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1: Status 404 returned error can't find the container with id d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1 Jan 27 11:35:37 crc kubenswrapper[4775]: I0127 11:35:37.178860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" event={"ID":"b7b04272-f555-4b48-8702-64db912ff8e8","Type":"ContainerStarted","Data":"d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1"} Jan 27 11:35:37 crc 
kubenswrapper[4775]: I0127 11:35:37.181601 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" event={"ID":"5cc2363e-089b-47c6-bb51-769dc3b41aef","Type":"ContainerStarted","Data":"a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986"} Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.037120 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.067639 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.068881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.072651 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.176999 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.177083 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.177232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: 
\"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.279239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.279391 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.280572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.280734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.281354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod 
\"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.343133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.372614 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.393943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.400607 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.402541 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.407141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488724 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488791 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.595831 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.596579 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.629802 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.748044 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.968084 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: W0127 11:35:38.974633 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7196167_1cda_485b_9bec_36ab0e666568.slice/crio-23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e WatchSource:0}: Error finding container 23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e: Status 404 returned error can't find the container with id 23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.198056 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerStarted","Data":"23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e"} Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.228522 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.229616 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.231928 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232127 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232260 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-44htb" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232389 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.238941 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.239369 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.239563 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.246416 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.255464 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:39 crc kubenswrapper[4775]: W0127 11:35:39.277842 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0ffffa8_8199_4d59_927b_5563eda147fd.slice/crio-3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6 WatchSource:0}: Error finding container 3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6: Status 404 returned error can't 
find the container with id 3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6 Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " 
pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc 
kubenswrapper[4775]: I0127 11:35:39.340803 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442015 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442803 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.443148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.443854 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.446669 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.450253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.450393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.454030 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.466993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.466070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.475643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.524567 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.529595 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532255 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gp9fv"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532278 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532378 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532691 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.534999 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.535180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.536214 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.538889 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.562384 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646038 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646115 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646320 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748099 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748113 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748843 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.749371 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.749528 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.750022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.750732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.753218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.765387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.768360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.771313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.772315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.778403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.854153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.117378 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:35:40 crc kubenswrapper[4775]: W0127 11:35:40.151307 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ba029b_2296_4519_b6b1_04674355258f.slice/crio-3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec WatchSource:0}: Error finding container 3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec: Status 404 returned error can't find the container with id 3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.232007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerStarted","Data":"3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6"}
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.233621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec"}
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.394641 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:35:40 crc kubenswrapper[4775]: W0127 11:35:40.408210 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83263987_4e3c_4e95_9083_bb6a43f52410.slice/crio-85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588 WatchSource:0}: Error finding container 85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588: Status 404 returned error can't find the container with id 85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.681812 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.686129 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688385 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688424 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688564 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-h6vhf"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.690957 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.691880 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.694381 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760910 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760956 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760988 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.761021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.761043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861836 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861906 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.864006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.864420 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.865130 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.865968 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.866049 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.881314 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.881853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.891700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.894861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0"
Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.011518 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.248211 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588"}
Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.598890 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.041675 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.045481 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.048364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.048407 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.049160 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.049524 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-t4hmx"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.067743 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.096136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.096842 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7pv\" (UniqueName: \"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097493 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097646 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.194895 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.197731 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.199509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dhmh7" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201118 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201177 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201218 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7pv\" (UniqueName: \"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.205296 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 27 11:35:42 crc kubenswrapper[4775]: 
I0127 11:35:42.206560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.206749 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.206919 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.207222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.208236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.208431 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " 
pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.223268 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.223678 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7pv\" (UniqueName: \"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.230286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.255163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.261966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.307813 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.307865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308372 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.375800 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.409804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.409983 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410050 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410111 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.411111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.411771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.418883 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.422082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.426762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.609912 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.328819 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.329957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.333422 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vjlch" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.342065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.347810 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.442538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.493204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.654278 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.331078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"faa33ccf16c94ee677fe261429dec55ad1899914247eb8bf91e4fa85aee22616"} Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.853228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.854382 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857254 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nb9n9" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857486 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857500 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857634 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857733 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.869095 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899054 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899123 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " 
pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899304 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.000989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001143 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001204 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001282 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001699 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002121 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002845 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.008368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.008686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.010270 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.019229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.029591 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.186943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.246267 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.249190 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.268866 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417114 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417141 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417573 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417585 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.437202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.583082 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.586428 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.587639 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.589622 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nj7nm" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.589958 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.593891 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.596636 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.597974 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.604879 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620182 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620230 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620270 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620399 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod 
\"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.622000 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722398 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod 
\"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722610 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 
11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723417 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.724142 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.724260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.726071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.732806 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.737008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " 
pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742125 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742425 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.906212 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.915358 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.038234 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.039735 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042463 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042687 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gvhkg" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042849 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.045161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.058205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.161930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162208 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 
11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.234023 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.236036 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.241878 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.268139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270883 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 
11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.271734 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.271809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.272767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.273434 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.274022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.283208 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.294838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.298605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " 
pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.300175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.309925 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.367622 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374126 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: 
\"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.480246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.497204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.552889 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.294433 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.295264 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rgjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(83263987-4e3c-4e95-9083-bb6a43f52410): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.296509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.403905 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.070100 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.070307 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkn2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-bzxbb_openstack(a0ffffa8-8199-4d59-927b-5563eda147fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.071585 4775 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.079869 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.079980 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9dg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-4xzdj_openstack(c7196167-1cda-485b-9bec-36ab0e666568): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.081316 4775 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.099716 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.099844 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jg7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-vkn58_openstack(5cc2363e-089b-47c6-bb51-769dc3b41aef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.101045 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" podUID="5cc2363e-089b-47c6-bb51-769dc3b41aef" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.138269 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.138444 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ph6kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-wkh2v_openstack(b7b04272-f555-4b48-8702-64db912ff8e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.139677 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" podUID="b7b04272-f555-4b48-8702-64db912ff8e8" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.403687 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.404115 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" Jan 27 11:35:59 crc kubenswrapper[4775]: I0127 11:35:59.518094 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:35:59 crc kubenswrapper[4775]: I0127 11:35:59.518401 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.417497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" event={"ID":"5cc2363e-089b-47c6-bb51-769dc3b41aef","Type":"ContainerDied","Data":"a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986"} Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.417794 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.421385 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" event={"ID":"b7b04272-f555-4b48-8702-64db912ff8e8","Type":"ContainerDied","Data":"d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1"} Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.421430 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.494604 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.506668 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.626879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627257 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627307 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"5cc2363e-089b-47c6-bb51-769dc3b41aef\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"5cc2363e-089b-47c6-bb51-769dc3b41aef\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627432 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627830 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.628002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config" (OuterVolumeSpecName: "config") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.628283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config" (OuterVolumeSpecName: "config") pod "5cc2363e-089b-47c6-bb51-769dc3b41aef" (UID: "5cc2363e-089b-47c6-bb51-769dc3b41aef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.634313 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w" (OuterVolumeSpecName: "kube-api-access-8jg7w") pod "5cc2363e-089b-47c6-bb51-769dc3b41aef" (UID: "5cc2363e-089b-47c6-bb51-769dc3b41aef"). InnerVolumeSpecName "kube-api-access-8jg7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.635904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn" (OuterVolumeSpecName: "kube-api-access-ph6kn") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "kube-api-access-ph6kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729367 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729401 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729413 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph6kn\" (UniqueName: 
\"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.232702 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.252057 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.260204 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.267294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: W0127 11:36:01.278927 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6108f26d_5e0a_490c_a7a4_8cefa3b99c7d.slice/crio-9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93 WatchSource:0}: Error finding container 9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93: Status 404 returned error can't find the container with id 9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93 Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.318185 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.348820 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.393861 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.435605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"07cc1808-c408-433d-aefa-f603408de606","Type":"ContainerStarted","Data":"325fbdc7382072689a0cdf3c2a42c3b72df96fb34e54cc553f116073ccfaaf84"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.437144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln" event={"ID":"cacc7142-a8d4-4607-adb7-0090fbd3024a","Type":"ContainerStarted","Data":"afb98490afa7e426904643804172eb216595d47e10150753b8f0072725543d31"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.438709 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerStarted","Data":"8c517699b915acc52e0019dc1c45d2e9a3ea6904e06f7498f332512ca9be5304"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.441937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerStarted","Data":"d68aa08b8c10efd267dbb532a84a73914540135473560968b1351b3eea784ca0"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.447516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.448577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerStarted","Data":"3c9369265622ba39ffc877111bca425ea61080e3b7bb1ee8ddc44e387299ce63"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.449574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" 
event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"2f0eac791862efa9ae20f0c460c94f3f0d3d8c62d5c60cb786f51daaccd12a67"}
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.450670 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58"
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.453758 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v"
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.453871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2"}
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.603168 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"]
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.613442 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"]
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.634576 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"]
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.645472 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"]
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.760822 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc2363e-089b-47c6-bb51-769dc3b41aef" path="/var/lib/kubelet/pods/5cc2363e-089b-47c6-bb51-769dc3b41aef/volumes"
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.764662 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b04272-f555-4b48-8702-64db912ff8e8" path="/var/lib/kubelet/pods/b7b04272-f555-4b48-8702-64db912ff8e8/volumes"
Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.210709 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 27 11:36:02 crc kubenswrapper[4775]: W0127 11:36:02.213707 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb252ada_9191_4d2d_8ab9_d12f4668a35a.slice/crio-d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6 WatchSource:0}: Error finding container d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6: Status 404 returned error can't find the container with id d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6
Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.467086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6"}
Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.470278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 27 11:36:03 crc kubenswrapper[4775]: I0127 11:36:03.479936 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"ea65f3829f0e7ba1ea821bc67e68f186029a80db38866cc31a1e47e34ca5b8ba"}
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.497919 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca"}
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.503894 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac" exitCode=0
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.503973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac"}
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.507490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414"}
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.512350 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5" exitCode=0
Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.512487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"}
Jan 27 11:36:08 crc kubenswrapper[4775]: I0127 11:36:08.534596 4775 generic.go:334] "Generic (PLEG): container finished" podID="9bafbfb6-d113-4a0f-a1dd-0d001a5448de" containerID="6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2" exitCode=0
Jan 27 11:36:08 crc kubenswrapper[4775]: I0127 11:36:08.534677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerDied","Data":"6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2"}
Jan 27 11:36:09 crc kubenswrapper[4775]: I0127 11:36:09.550628 4775 generic.go:334] "Generic (PLEG): container finished" podID="6108f26d-5e0a-490c-a7a4-8cefa3b99c7d" containerID="483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414" exitCode=0
Jan 27 11:36:09 crc kubenswrapper[4775]: I0127 11:36:09.550690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerDied","Data":"483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.592082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"c88e0678799b6a493451f75ba923d4eb4c771df920e39200189f796c2acf1415"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.600796 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln" event={"ID":"cacc7142-a8d4-4607-adb7-0090fbd3024a","Type":"ContainerStarted","Data":"25719052291ce0784c5cdefd82e34f01bac4601d4f6545a9d38f8ae2876c9ef3"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.600903 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4hqln"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.604040 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead" exitCode=0
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.604192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.607504 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c" exitCode=0
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.607658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.614514 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.026760383 podStartE2EDuration="32.614497814s" podCreationTimestamp="2026-01-27 11:35:39 +0000 UTC" firstStartedPulling="2026-01-27 11:35:46.850752625 +0000 UTC m=+925.992350402" lastFinishedPulling="2026-01-27 11:36:00.438490046 +0000 UTC m=+939.580087833" observedRunningTime="2026-01-27 11:36:11.613586248 +0000 UTC m=+950.755184035" watchObservedRunningTime="2026-01-27 11:36:11.614497814 +0000 UTC m=+950.756095591"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.616059 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerID="357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da" exitCode=0
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.616108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.620014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.628501 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"2b3a4e1cf69bcdc101b9c3a20d0c3eb06f0e0a30352b47c0121c787b62e10c75"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.632084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerStarted","Data":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.632300 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.635632 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.649766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"559b79147016a36fa1d81a8459f89139f84ea56d20c9fb27270d55252a8da1c8"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.651626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"09edec2805bedc01877cbe4c65bbfc4e8b77a8b718dcb6d0a4db1de2befc8064"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.656860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"07cc1808-c408-433d-aefa-f603408de606","Type":"ContainerStarted","Data":"df0a956ea4f5334c7744ad328c65be6ec217655067de8f9c23f23040ea40c16a"}
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.657029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.687204 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4hqln" podStartSLOduration=15.149826773 podStartE2EDuration="23.687174357s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.24821295 +0000 UTC m=+940.389810727" lastFinishedPulling="2026-01-27 11:36:09.785560484 +0000 UTC m=+948.927158311" observedRunningTime="2026-01-27 11:36:11.67775071 +0000 UTC m=+950.819348507" watchObservedRunningTime="2026-01-27 11:36:11.687174357 +0000 UTC m=+950.828772154"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.724086 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.724068495 podStartE2EDuration="30.724068495s" podCreationTimestamp="2026-01-27 11:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:11.717071224 +0000 UTC m=+950.858669021" watchObservedRunningTime="2026-01-27 11:36:11.724068495 +0000 UTC m=+950.865666272"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.766796 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.691000081 podStartE2EDuration="27.766777011s" podCreationTimestamp="2026-01-27 11:35:44 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.276105071 +0000 UTC m=+940.417702838" lastFinishedPulling="2026-01-27 11:36:10.351881981 +0000 UTC m=+949.493479768" observedRunningTime="2026-01-27 11:36:11.761140247 +0000 UTC m=+950.902738024" watchObservedRunningTime="2026-01-27 11:36:11.766777011 +0000 UTC m=+950.908374788"
Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.807464 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.385546539 podStartE2EDuration="29.807433621s" podCreationTimestamp="2026-01-27 11:35:42 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.248300272 +0000 UTC m=+940.389898059" lastFinishedPulling="2026-01-27 11:36:09.670187354 +0000 UTC m=+948.811785141" observedRunningTime="2026-01-27 11:36:11.798255171 +0000 UTC m=+950.939852948" watchObservedRunningTime="2026-01-27 11:36:11.807433621 +0000 UTC m=+950.949031398"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.382039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.382412 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.665086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerStarted","Data":"c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4"}
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.667520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerStarted","Data":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"}
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.675627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerStarted","Data":"8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b"}
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.676172 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.678491 4775 generic.go:334] "Generic (PLEG): container finished" podID="b06b991d-b108-4b21-82e5-43b3662c7aee" containerID="a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108" exitCode=0
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.681618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerDied","Data":"a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108"}
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.685679 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hp57" podStartSLOduration=15.130154556 podStartE2EDuration="21.685665037s" podCreationTimestamp="2026-01-27 11:35:51 +0000 UTC" firstStartedPulling="2026-01-27 11:36:05.506423941 +0000 UTC m=+944.648021718" lastFinishedPulling="2026-01-27 11:36:12.061934422 +0000 UTC m=+951.203532199" observedRunningTime="2026-01-27 11:36:12.682008326 +0000 UTC m=+951.823606103" watchObservedRunningTime="2026-01-27 11:36:12.685665037 +0000 UTC m=+951.827262804"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.701789 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podStartSLOduration=3.67405448 podStartE2EDuration="34.701771976s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:39.28006511 +0000 UTC m=+918.421662877" lastFinishedPulling="2026-01-27 11:36:10.307782596 +0000 UTC m=+949.449380373" observedRunningTime="2026-01-27 11:36:12.696793581 +0000 UTC m=+951.838391348" watchObservedRunningTime="2026-01-27 11:36:12.701771976 +0000 UTC m=+951.843369753"
Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.742623 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c4p9c" podStartSLOduration=17.921028783 podStartE2EDuration="24.742605281s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:05.51407352 +0000 UTC m=+944.655671307" lastFinishedPulling="2026-01-27 11:36:12.335650028 +0000 UTC m=+951.477247805" observedRunningTime="2026-01-27 11:36:12.742251882 +0000 UTC m=+951.883849659" watchObservedRunningTime="2026-01-27 11:36:12.742605281 +0000 UTC m=+951.884203058"
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.693202 4775 generic.go:334] "Generic (PLEG): container finished" podID="c7196167-1cda-485b-9bec-36ab0e666568" containerID="97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6" exitCode=0
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.693876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6"}
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.710910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"91043a0286a475941ff1e2e15ebb1ad2db71ddc9330b5fcf6a29f56fdf7de1f4"}
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.715229 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"512935f0d3a93c8d67e1544c268864b15cd99b14f0311dd4e91ce2d818961543"}
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.716701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"91f6dd66343e9ed907e5d0f090a63d8fc62fde24a06bc31cbe61712fbc467f0a"}
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.771691 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.648922913 podStartE2EDuration="24.771670295s" podCreationTimestamp="2026-01-27 11:35:50 +0000 UTC" firstStartedPulling="2026-01-27 11:36:02.215349052 +0000 UTC m=+941.356946829" lastFinishedPulling="2026-01-27 11:36:14.338096434 +0000 UTC m=+953.479694211" observedRunningTime="2026-01-27 11:36:14.760217092 +0000 UTC m=+953.901814889" watchObservedRunningTime="2026-01-27 11:36:14.771670295 +0000 UTC m=+953.913268082"
Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.795890 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.957278334 podStartE2EDuration="28.795850426s" podCreationTimestamp="2026-01-27 11:35:46 +0000 UTC" firstStartedPulling="2026-01-27 11:36:02.482542359 +0000 UTC m=+941.624140136" lastFinishedPulling="2026-01-27 11:36:14.321114461 +0000 UTC m=+953.462712228" observedRunningTime="2026-01-27 11:36:14.788904166 +0000 UTC m=+953.930501953" watchObservedRunningTime="2026-01-27 11:36:14.795850426 +0000 UTC m=+953.937448313"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.187715 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.230889 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.368843 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.410043 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.726255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerStarted","Data":"ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3"}
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.726646 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"58471d545306326985d3bc8a879bfb69c3624a87f6ef783ea2f890ec8db36211"}
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730573 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730635 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.756549 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podStartSLOduration=-9223371999.098242 podStartE2EDuration="37.756533802s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:38.976109389 +0000 UTC m=+918.117707166" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:15.75499285 +0000 UTC m=+954.896590647" watchObservedRunningTime="2026-01-27 11:36:15.756533802 +0000 UTC m=+954.898131579"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.777581 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.790264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.795736 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-l9blz" podStartSLOduration=19.059865316 podStartE2EDuration="27.795718272s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.42982411 +0000 UTC m=+940.571421897" lastFinishedPulling="2026-01-27 11:36:10.165677076 +0000 UTC m=+949.307274853" observedRunningTime="2026-01-27 11:36:15.79162992 +0000 UTC m=+954.933227687" watchObservedRunningTime="2026-01-27 11:36:15.795718272 +0000 UTC m=+954.937316059"
Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.986704 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.026161 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.027510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.031350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.049682 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.064344 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9xncr"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.065466 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.067204 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.075109 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9xncr"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131615 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131707 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.209260 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.209622 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns" containerID="cri-o://8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b" gracePeriod=10
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.210575 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234134 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.235615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.235627 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236634 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236673 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.237192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.243001 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.247169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.253808 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.254514 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.256194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.257439 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.268528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.281250 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"]
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.338663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339158 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.347636 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.351045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.353539 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.356218 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.356419 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4q76m" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.357073 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.359816 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.365664 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.378348 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440755 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440832 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441648 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441822 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441913 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441978 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442015 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442100 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.443272 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.444601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.460024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: 
\"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547788 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.548558 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.551080 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.558193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.559582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.560101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.560174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.569187 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.674809 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.740142 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.741632 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.741668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.804620 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.808388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.824607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.885219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.928436 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9xncr"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957615 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.958394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod 
\"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.958997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.977614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.142497 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.178966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:17 crc kubenswrapper[4775]: W0127 11:36:17.196749 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d3ee22_9b3b_46ac_b896_ba5c521e1753.slice/crio-b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28 WatchSource:0}: Error finding container b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28: Status 404 returned error can't find the container with id b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28 Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.356837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.613299 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.613762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.773972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.774010 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.777357 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-metrics-9xncr" event={"ID":"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5","Type":"ContainerStarted","Data":"298409a96c786441a60df4dd9efea24a8e3bca3bb653c04e5c0a57d1df204821"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.777380 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9xncr" event={"ID":"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5","Type":"ContainerStarted","Data":"0c8e62c1a291e0de5a79d6048e73580e36b94e4698e9d98bcf379972996a6c37"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.780174 4775 generic.go:334] "Generic (PLEG): container finished" podID="de956838-03d3-41d8-96d3-a85293eff207" containerID="2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f" exitCode=0 Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.780224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.780249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerStarted","Data":"fd5b11d9815172e4b8d84472d53d5f5c5a67656114e7d95c656d684b7f601224"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.782914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"2bf60354b1e345338bcc540ea18be851719d1616f6dbd39f82c6fe1a3f139081"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.797940 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9xncr" podStartSLOduration=1.7979232920000001 podStartE2EDuration="1.797923292s" 
podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:17.793146141 +0000 UTC m=+956.934743928" watchObservedRunningTime="2026-01-27 11:36:17.797923292 +0000 UTC m=+956.939521069" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.803562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"f2f98601e1fc4d2ec97f9e0c70c2dbd57bb16d6fdfa6d9ac4a20a475acb21242"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.811833 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerID="8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b" exitCode=0 Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.811899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.813265 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns" containerID="cri-o://ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" gracePeriod=10 Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.099068 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187162 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187274 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.193721 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z" (OuterVolumeSpecName: "kube-api-access-qkn2z") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "kube-api-access-qkn2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.232147 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.232993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config" (OuterVolumeSpecName: "config") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294500 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294553 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294566 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.584820 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.585042 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.661088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.821307 4775 generic.go:334] "Generic (PLEG): container finished" podID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" 
containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9" exitCode=0
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.821406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.824400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerStarted","Data":"be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.824501 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826308 4775 generic.go:334] "Generic (PLEG): container finished" podID="c7196167-1cda-485b-9bec-36ab0e666568" containerID="ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" exitCode=0
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826335 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826441 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.828472 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051" exitCode=0
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.828520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.829651 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6"}
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830178 4775 scope.go:117] "RemoveContainer" containerID="8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830284 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.860316 4775 scope.go:117] "RemoveContainer" containerID="357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.868373 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podStartSLOduration=2.868354615 podStartE2EDuration="2.868354615s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:18.863073402 +0000 UTC m=+958.004671179" watchObservedRunningTime="2026-01-27 11:36:18.868354615 +0000 UTC m=+958.009952392"
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.923894 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"]
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.930514 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"]
Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.968376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c4p9c"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.008020 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") "
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.008294 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") "
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.008503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") "
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.016004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2" (OuterVolumeSpecName: "kube-api-access-d9dg2") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "kube-api-access-d9dg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.063103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config" (OuterVolumeSpecName: "config") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.063111 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110176 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110201 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110212 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401144 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401606 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401627 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401649 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401658 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401674 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401681 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401703 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401902 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401922 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.403415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.423705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517251 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517625 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619159 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.644015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.727723 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.757393 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" path="/var/lib/kubelet/pods/a0ffffa8-8199-4d59-927b-5563eda147fd/volumes"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.855917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.856302 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"27eec8f7676cb37dd7daff05707969bb0fd08a19b54b71f6d798ce512808ccdd"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858042 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"f95cf09a61434fbcb9c78b8a19d1d14e6adc70a9c67bd12d89c862c967841ad5"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858853 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.863617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.866464 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.879679 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" podStartSLOduration=3.879664415 podStartE2EDuration="3.879664415s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:19.877270549 +0000 UTC m=+959.018868326" watchObservedRunningTime="2026-01-27 11:36:19.879664415 +0000 UTC m=+959.021262192"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.899682 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.546321941 podStartE2EDuration="3.899664321s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="2026-01-27 11:36:17.349511836 +0000 UTC m=+956.491109613" lastFinishedPulling="2026-01-27 11:36:18.702854216 +0000 UTC m=+957.844451993" observedRunningTime="2026-01-27 11:36:19.893218884 +0000 UTC m=+959.034816671" watchObservedRunningTime="2026-01-27 11:36:19.899664321 +0000 UTC m=+959.041262098"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.939117 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"]
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.956975 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"]
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.204832 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:20 crc kubenswrapper[4775]: W0127 11:36:20.207527 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6330ccb9_6a5a_42d6_8c0f_b3c395b867a0.slice/crio-99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab WatchSource:0}: Error finding container 99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab: Status 404 returned error can't find the container with id 99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.505999 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.727967 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872193 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" exitCode=0
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a"}
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab"}
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.874186 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b" exitCode=0
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.874263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.012632 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.012946 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.126175 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.127109 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.129165 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.139648 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.158098 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.179141 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.179406 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c4p9c" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server" containerID="cri-o://6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c" gracePeriod=2
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.281226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.281551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.383644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.383716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.384668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.410281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.443802 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.557386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.557819 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.617919 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.652548 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792555 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792601 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792772 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7196167-1cda-485b-9bec-36ab0e666568" path="/var/lib/kubelet/pods/c7196167-1cda-485b-9bec-36ab0e666568/volumes"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.793798 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities" (OuterVolumeSpecName: "utilities") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.801708 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk" (OuterVolumeSpecName: "kube-api-access-vpgqk") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "kube-api-access-vpgqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.845136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.882915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.886014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.889438 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c" exitCode=0
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.890070 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"3c9369265622ba39ffc877111bca425ea61080e3b7bb1ee8ddc44e387299ce63"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891718 4775 scope.go:117] "RemoveContainer" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893721 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893741 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893753 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.930586 4775 scope.go:117] "RemoveContainer" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.943884 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s5p8" podStartSLOduration=3.510914934 podStartE2EDuration="5.943866428s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="2026-01-27 11:36:18.830025079 +0000 UTC m=+957.971622856" lastFinishedPulling="2026-01-27 11:36:21.262976563 +0000 UTC m=+960.404574350" observedRunningTime="2026-01-27 11:36:21.920294314 +0000 UTC m=+961.061892091" watchObservedRunningTime="2026-01-27 11:36:21.943866428 +0000 UTC m=+961.085464205"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.944816 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.951317 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.955281 4775 scope.go:117] "RemoveContainer" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.961034 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.981414 4775 scope.go:117] "RemoveContainer" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.982043 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": container with ID starting with 6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c not found: ID does not exist" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982077 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"} err="failed to get container status \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": rpc error: code = NotFound desc = could not find container \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": container with ID starting with 6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982096 4775 scope.go:117] "RemoveContainer" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.982479 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": container with ID starting with ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c not found: ID does not exist" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982619 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"} err="failed to get container status \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": rpc error: code = NotFound desc = could not find container \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": container with ID starting with ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982654 4775 scope.go:117] "RemoveContainer" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.983695 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": container with ID starting with 8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5 not found: ID does not exist" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.983724 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"} err="failed to get container status \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": rpc error: code = NotFound desc = could not find container \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": container with ID starting with 8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5 not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.988007 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.004507 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.018975 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.113822 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114127 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-content"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114140 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-content"
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114150 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114155 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114164 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-utilities"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114170 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-utilities"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114302 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114787 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.125399 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z98pk"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.197089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.197394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.232169 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.233341 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.235463 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.245915 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.306461 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.306827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.307477 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.327054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc 
kubenswrapper[4775]: I0127 11:36:22.411214 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.411329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.424796 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m5645"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.425789 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.432865 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5645"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.434327 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513061 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513572 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.515218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod 
\"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.532021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.545768 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.547111 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.548322 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.567229 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.594080 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.615016 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.615150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.616189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.632335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc 
kubenswrapper[4775]: I0127 11:36:22.717105 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.717174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.770043 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.818379 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.818444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.819105 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.835127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.905461 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" exitCode=0 Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.905545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"} Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.907819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z98pk"] Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909572 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerID="8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d" exitCode=0 Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909806 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" 
event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerDied","Data":"8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d"} Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerStarted","Data":"1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68"} Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.960893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.018342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"] Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.198661 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5645"] Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.448453 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"] Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.764493 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c63ae2-f837-485f-9f74-0606288c3666" path="/var/lib/kubelet/pods/32c63ae2-f837-485f-9f74-0606288c3666/volumes" Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925224 4775 generic.go:334] "Generic (PLEG): container finished" podID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerID="7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab" exitCode=0 Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerDied","Data":"7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab"} Jan 
27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925425 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerStarted","Data":"8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.926627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerStarted","Data":"25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.926661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerStarted","Data":"2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.932975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerStarted","Data":"a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.933007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerStarted","Data":"df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.938982 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 
11:36:23.940727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerStarted","Data":"3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.940782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerStarted","Data":"5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f"} Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.970074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2856-account-create-update-zgmqw" podStartSLOduration=1.970050723 podStartE2EDuration="1.970050723s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:23.964669016 +0000 UTC m=+963.106266813" watchObservedRunningTime="2026-01-27 11:36:23.970050723 +0000 UTC m=+963.111648520" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.296639 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jz4kw" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.347944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"ba97a22f-b5dd-4289-bb3b-39578c05f231\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.348117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"ba97a22f-b5dd-4289-bb3b-39578c05f231\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.348711 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba97a22f-b5dd-4289-bb3b-39578c05f231" (UID: "ba97a22f-b5dd-4289-bb3b-39578c05f231"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.359709 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w" (OuterVolumeSpecName: "kube-api-access-4cv6w") pod "ba97a22f-b5dd-4289-bb3b-39578c05f231" (UID: "ba97a22f-b5dd-4289-bb3b-39578c05f231"). InnerVolumeSpecName "kube-api-access-4cv6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.449495 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.449529 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.675462 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.784205 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.784451 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" containerID="cri-o://be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c" gracePeriod=10 Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.786193 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811457 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:36:24 crc kubenswrapper[4775]: E0127 11:36:24.811777 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811793 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811942 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.812728 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.834782 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerDied","Data":"1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68"} Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949914 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jz4kw" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949908 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959653 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959789 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.976721 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8d1f-account-create-update-gbh56" podStartSLOduration=2.976702854 podStartE2EDuration="2.976702854s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:24.970071973 +0000 UTC m=+964.111669750" watchObservedRunningTime="2026-01-27 11:36:24.976702854 +0000 UTC m=+964.118300631" Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.985667 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2tfh" podStartSLOduration=3.542734202 podStartE2EDuration="5.985651168s" podCreationTimestamp="2026-01-27 11:36:19 +0000 UTC" firstStartedPulling="2026-01-27 11:36:20.87328792 +0000 UTC m=+960.014885697" lastFinishedPulling="2026-01-27 11:36:23.316204886 +0000 UTC m=+962.457802663" observedRunningTime="2026-01-27 11:36:24.985186706 +0000 UTC m=+964.126784483" watchObservedRunningTime="2026-01-27 11:36:24.985651168 +0000 UTC m=+964.127248945" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.004547 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m5645" podStartSLOduration=3.004529794 podStartE2EDuration="3.004529794s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:25.002827758 +0000 UTC 
m=+964.144425545" watchObservedRunningTime="2026-01-27 11:36:25.004529794 +0000 UTC m=+964.146127571" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062858 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.063009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.063079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " 
pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.064132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.067741 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.067922 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.068317 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.085022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.132013 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.313251 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.368917 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.369347 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.370209 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f53ed1d7-9aa1-49d4-8396-c3487e0465d6" (UID: "f53ed1d7-9aa1-49d4-8396-c3487e0465d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.373990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk" (OuterVolumeSpecName: "kube-api-access-cgcjk") pod "f53ed1d7-9aa1-49d4-8396-c3487e0465d6" (UID: "f53ed1d7-9aa1-49d4-8396-c3487e0465d6"). InnerVolumeSpecName "kube-api-access-cgcjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.471331 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.471358 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:25 crc kubenswrapper[4775]: W0127 11:36:25.685144 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24ee1fa_0d6a_4ca1_b298_d876f473f8f8.slice/crio-28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094 WatchSource:0}: Error finding container 28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094: Status 404 returned error can't find the container with id 28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094 Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.686659 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.781244 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.781504 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hp57" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server" containerID="cri-o://c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4" gracePeriod=2 Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.960837 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-z98pk" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.960864 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerDied","Data":"8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad"} Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.961493 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.966616 4775 generic.go:334] "Generic (PLEG): container finished" podID="de956838-03d3-41d8-96d3-a85293eff207" containerID="be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c" exitCode=0 Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.966687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c"} Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.969700 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerStarted","Data":"28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094"} Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.973459 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 27 11:36:25 crc kubenswrapper[4775]: E0127 11:36:25.973899 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerName="mariadb-database-create" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.973924 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" 
containerName="mariadb-database-create" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.974124 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerName="mariadb-database-create" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.984802 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.987124 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.987204 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xl5vv" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.989021 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.990930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.998509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081802 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081860 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 
11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081886 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082222 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184329 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184799 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184829 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184888 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:26.684868769 +0000 UTC m=+965.826466536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184941 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.185240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.191313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.202421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.211146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.348833 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: connect: connection refused" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.505187 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.508601 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514212 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514405 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514593 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.518045 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.593677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.593947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594013 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 
11:36:26.594136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594316 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.677121 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695788 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695942 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod 
\"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695965 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.696029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.696668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696780 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696801 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696841 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. 
No retries permitted until 2026-01-27 11:36:27.69682602 +0000 UTC m=+966.838423797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.697777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.698065 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.699890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.702226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.703801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.721286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.832957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.981952 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4" exitCode=0 Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.982033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4"} Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.987068 4775 generic.go:334] "Generic (PLEG): container finished" podID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerID="3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985" exitCode=0 Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.987099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerDied","Data":"3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985"} Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.143223 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8s5p8"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.144046 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s5p8"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.208011 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s5p8"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.348116 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"]
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.491537 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611531 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611839 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.616795 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9" (OuterVolumeSpecName: "kube-api-access-ztkr9") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "kube-api-access-ztkr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.655392 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config" (OuterVolumeSpecName: "config") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.657802 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.666128 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.669891 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.712910 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713154 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") "
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713337 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713527 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713539 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713548 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713558 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities" (OuterVolumeSpecName: "utilities") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713648 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713658 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:29.713680721 +0000 UTC m=+968.855278498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.734353 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm" (OuterVolumeSpecName: "kube-api-access-9jjzm") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "kube-api-access-9jjzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.761612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808139 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-599fs"]
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808550 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-utilities"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808573 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-utilities"
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808593 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-content"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808602 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-content"
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808625 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808634 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns"
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808654 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="init"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808663 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="init"
Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808687 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808914 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808927 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.809934 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-599fs"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815264 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815287 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815300 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.817281 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-599fs"]
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.916594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.916734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.931817 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"]
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.933074 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.935199 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.959537 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"fd5b11d9815172e4b8d84472d53d5f5c5a67656114e7d95c656d684b7f601224"}
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001922 4775 scope.go:117] "RemoveContainer" containerID="be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001929 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.003862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerStarted","Data":"40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8"}
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.010785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"d68aa08b8c10efd267dbb532a84a73914540135473560968b1351b3eea784ca0"}
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.010809 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019233 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.021522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.022107 4775 generic.go:334] "Generic (PLEG): container finished" podID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerID="b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b" exitCode=0
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.022469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b"}
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.040664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.057165 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.078300 4775 scope.go:117] "RemoveContainer" containerID="2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.078442 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.094872 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s5p8"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.106584 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.117148 4775 scope.go:117] "RemoveContainer" containerID="c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.117580 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.121480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.121573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.122577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.125881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-599fs"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.143193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.184706 4775 scope.go:117] "RemoveContainer" containerID="862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.212596 4775 scope.go:117] "RemoveContainer" containerID="a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.256676 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.490287 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645"
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.527552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") "
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.527818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") "
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.529444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" (UID: "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.539530 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh" (OuterVolumeSpecName: "kube-api-access-5rfmh") pod "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" (UID: "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d"). InnerVolumeSpecName "kube-api-access-5rfmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.629839 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.629884 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.649147 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-599fs"]
Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.777664 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"]
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.032622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerStarted","Data":"5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.033707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.036275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerStarted","Data":"b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.036309 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerStarted","Data":"3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.039605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerStarted","Data":"1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.039637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerStarted","Data":"01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.041679 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.043067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerDied","Data":"5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f"}
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.043179 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.059749 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podStartSLOduration=5.059729011 podStartE2EDuration="5.059729011s" podCreationTimestamp="2026-01-27 11:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.054016276 +0000 UTC m=+968.195614083" watchObservedRunningTime="2026-01-27 11:36:29.059729011 +0000 UTC m=+968.201326788"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.080026 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-599fs" podStartSLOduration=2.080009076 podStartE2EDuration="2.080009076s" podCreationTimestamp="2026-01-27 11:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.069614441 +0000 UTC m=+968.211212218" watchObservedRunningTime="2026-01-27 11:36:29.080009076 +0000 UTC m=+968.221606853"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.088869 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9763-account-create-update-dms9b" podStartSLOduration=2.088851587 podStartE2EDuration="2.088851587s" podCreationTimestamp="2026-01-27 11:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.086552634 +0000 UTC m=+968.228150411" watchObservedRunningTime="2026-01-27 11:36:29.088851587 +0000 UTC m=+968.230449354"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521080 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521191 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521249 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.522214 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.522292 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310" gracePeriod=600
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.659359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.669883 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.728123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.728255 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.747301 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0"
Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747498 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747512 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747550 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:33.747537465 +0000 UTC m=+972.889135242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.786650 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" path="/var/lib/kubelet/pods/ba97a22f-b5dd-4289-bb3b-39578c05f231/volumes"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.787160 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de956838-03d3-41d8-96d3-a85293eff207" path="/var/lib/kubelet/pods/de956838-03d3-41d8-96d3-a85293eff207/volumes"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.787685 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" path="/var/lib/kubelet/pods/f7ac68bf-cd99-4022-af50-a73ddc6181b0/volumes"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.788849 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vvxg4"]
Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.789111 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789126 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789288 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789783 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxg4"]
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789860 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.791318 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.849011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.849202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.951212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.951417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.952200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.974530 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.114023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.199546 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"]
Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.796880 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t2tfh" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" probeResult="failure" output=<
Jan 27 11:36:30 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s
Jan 27 11:36:30 crc kubenswrapper[4775]: >
Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.064302 4775 generic.go:334] "Generic (PLEG): container finished" podID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerID="b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c" exitCode=0
Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.064495 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerDied","Data":"b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c"}
Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071103 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310" exitCode=0
Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"}
Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071375 4775 scope.go:117] "RemoveContainer" containerID="2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"
Jan
27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.074409 4775 generic.go:334] "Generic (PLEG): container finished" podID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerID="25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e" exitCode=0 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.074609 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerDied","Data":"25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e"} Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.075121 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8s5p8" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server" containerID="cri-o://980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" gracePeriod=2 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.806922 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.114964 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.115166 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.117830 4775 generic.go:334] "Generic (PLEG): container finished" podID="f577e755-a863-4fea-9288-6cd30168b405" containerID="1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 
11:36:32.117876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerDied","Data":"1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.120499 4775 generic.go:334] "Generic (PLEG): container finished" podID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerID="a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.120678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerDied","Data":"a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.185886 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291801 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.292783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities" (OuterVolumeSpecName: "utilities") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.299221 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q" (OuterVolumeSpecName: "kube-api-access-d7c9q") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "kube-api-access-d7c9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.393673 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.393703 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.403128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxg4"] Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.552980 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.581591 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.585151 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"0bbde61d-aca8-4b36-8896-9c0db3e081be\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602391 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"0bbde61d-aca8-4b36-8896-9c0db3e081be\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602867 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.604247 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"0bbde61d-aca8-4b36-8896-9c0db3e081be" (UID: "0bbde61d-aca8-4b36-8896-9c0db3e081be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.608696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx" (OuterVolumeSpecName: "kube-api-access-jrcgx") pod "0bbde61d-aca8-4b36-8896-9c0db3e081be" (UID: "0bbde61d-aca8-4b36-8896-9c0db3e081be"). InnerVolumeSpecName "kube-api-access-jrcgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.704498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"c3cd1d9e-b735-4f90-b92a-00353e576e10\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.704558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"c3cd1d9e-b735-4f90-b92a-00353e576e10\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3cd1d9e-b735-4f90-b92a-00353e576e10" (UID: "c3cd1d9e-b735-4f90-b92a-00353e576e10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705141 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705159 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705171 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.707051 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh" (OuterVolumeSpecName: "kube-api-access-wlxfh") pod "c3cd1d9e-b735-4f90-b92a-00353e576e10" (UID: "c3cd1d9e-b735-4f90-b92a-00353e576e10"). InnerVolumeSpecName "kube-api-access-wlxfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.807390 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.128765 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130163 4775 generic.go:334] "Generic (PLEG): container finished" podID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerID="f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886" exitCode=0 Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130223 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerDied","Data":"f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerStarted","Data":"24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"f2f98601e1fc4d2ec97f9e0c70c2dbd57bb16d6fdfa6d9ac4a20a475acb21242"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132223 4775 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132234 4775 scope.go:117] "RemoveContainer" containerID="980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136721 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerDied","Data":"2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136910 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146231 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerDied","Data":"3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146326 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.148519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerStarted","Data":"510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.159830 4775 scope.go:117] "RemoveContainer" containerID="1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.192024 4775 scope.go:117] "RemoveContainer" containerID="dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.196427 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7bdl6" podStartSLOduration=2.597667803 podStartE2EDuration="7.192360134s" podCreationTimestamp="2026-01-27 11:36:26 +0000 UTC" firstStartedPulling="2026-01-27 11:36:27.366364256 +0000 UTC m=+966.507962033" lastFinishedPulling="2026-01-27 11:36:31.961056587 +0000 UTC m=+971.102654364" observedRunningTime="2026-01-27 11:36:33.186619347 +0000 UTC m=+972.328217124" watchObservedRunningTime="2026-01-27 11:36:33.192360134 +0000 UTC m=+972.333957911" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.214498 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.228722 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.535847 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.543639 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.625993 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"f577e755-a863-4fea-9288-6cd30168b405\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626075 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod 
\"f577e755-a863-4fea-9288-6cd30168b405\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626511 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24d04bb6-3007-42c5-9753-746a6eeb7d1c" (UID: "24d04bb6-3007-42c5-9753-746a6eeb7d1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626658 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f577e755-a863-4fea-9288-6cd30168b405" (UID: "f577e755-a863-4fea-9288-6cd30168b405"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.635163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8" (OuterVolumeSpecName: "kube-api-access-bccp8") pod "f577e755-a863-4fea-9288-6cd30168b405" (UID: "f577e755-a863-4fea-9288-6cd30168b405"). InnerVolumeSpecName "kube-api-access-bccp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.635360 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc" (OuterVolumeSpecName: "kube-api-access-66pxc") pod "24d04bb6-3007-42c5-9753-746a6eeb7d1c" (UID: "24d04bb6-3007-42c5-9753-746a6eeb7d1c"). InnerVolumeSpecName "kube-api-access-66pxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728228 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728263 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728273 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728283 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.755213 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b48db0-1768-4940-9e42-0362374c7358" path="/var/lib/kubelet/pods/a6b48db0-1768-4940-9e42-0362374c7358/volumes" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.829396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 11:36:33.829645 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 
11:36:33.829681 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 11:36:33.829758 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:41.8297336 +0000 UTC m=+980.971331377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.158858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerDied","Data":"01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327"} Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.158917 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.159071 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerDied","Data":"df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622"} Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162264 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162312 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.525819 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.641070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"f208e1de-fc0e-4deb-a093-d27604b3931f\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.641354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"f208e1de-fc0e-4deb-a093-d27604b3931f\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.642197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "f208e1de-fc0e-4deb-a093-d27604b3931f" (UID: "f208e1de-fc0e-4deb-a093-d27604b3931f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.646166 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq" (OuterVolumeSpecName: "kube-api-access-cmbjq") pod "f208e1de-fc0e-4deb-a093-d27604b3931f" (UID: "f208e1de-fc0e-4deb-a093-d27604b3931f"). InnerVolumeSpecName "kube-api-access-cmbjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.745129 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.746129 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.133802 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.176801 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerDied","Data":"24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433"} Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.176904 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 
11:36:35.176834 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4"
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.201520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"]
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.201810 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns" containerID="cri-o://8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" gracePeriod=10
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.765084 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870259 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870385 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870584 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") "
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.903252 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x" (OuterVolumeSpecName: "kube-api-access-m5q2x") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "kube-api-access-m5q2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.911491 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config" (OuterVolumeSpecName: "config") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.913981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.922672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.923745 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974416 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974444 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974472 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974481 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974489 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185538 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"}
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185623 4775 scope.go:117] "RemoveContainer" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.187628 4775 generic.go:334] "Generic (PLEG): container finished" podID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" exitCode=0
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.187742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28"}
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.208020 4775 scope.go:117] "RemoveContainer" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.232012 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"]
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.237731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"]
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.239050 4775 scope.go:117] "RemoveContainer" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"
Jan 27 11:36:36 crc kubenswrapper[4775]: E0127 11:36:36.240532 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": container with ID starting with 8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3 not found: ID does not exist" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240564 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"} err="failed to get container status \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": rpc error: code = NotFound desc = could not find container \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": container with ID starting with 8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3 not found: ID does not exist"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240588 4775 scope.go:117] "RemoveContainer" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"
Jan 27 11:36:36 crc kubenswrapper[4775]: E0127 11:36:36.240938 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": container with ID starting with eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9 not found: ID does not exist" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240958 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"} err="failed to get container status \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": rpc error: code = NotFound desc = could not find container \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": container with ID starting with eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9 not found: ID does not exist"
Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.201945 4775 generic.go:334] "Generic (PLEG): container finished" podID="01ba029b-2296-4519-b6b1-04674355258f" containerID="74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca" exitCode=0
Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.202000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca"}
Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.754738 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" path="/var/lib/kubelet/pods/31d3ee22-9b3b-46ac-b896-ba5c521e1753/volumes"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.088980 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sd44h"]
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089309 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089326 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089338 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="init"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089346 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="init"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089362 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089371 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089391 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-content"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089402 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-content"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089412 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089417 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089435 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089444 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089473 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089486 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-utilities"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089494 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-utilities"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089508 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089514 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089523 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089529 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089696 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089713 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089727 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089737 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089748 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089765 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089776 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.090416 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.092057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ghp7c"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.092258 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.103988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sd44h"]
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107118 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107245 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209592 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.214785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.216323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1"}
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.216660 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.217974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.235084 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.251704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.406340 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.970235 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.787559383 podStartE2EDuration="1m0.97020816s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:40.159252431 +0000 UTC m=+919.300850198" lastFinishedPulling="2026-01-27 11:36:00.341901198 +0000 UTC m=+939.483498975" observedRunningTime="2026-01-27 11:36:38.248285043 +0000 UTC m=+977.389882830" watchObservedRunningTime="2026-01-27 11:36:38.97020816 +0000 UTC m=+978.111805957"
Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.976954 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sd44h"]
Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.222068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerStarted","Data":"6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6"}
Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.223202 4775 generic.go:334] "Generic (PLEG): container finished" podID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerID="510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261" exitCode=0
Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.223478 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerDied","Data":"510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261"}
Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.773414 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.825905 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.008948 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.548681 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6"
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671532 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") "
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.672918 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.673558 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.677404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx" (OuterVolumeSpecName: "kube-api-access-th5jx") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "kube-api-access-th5jx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.693200 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts" (OuterVolumeSpecName: "scripts") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.693967 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.696196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.708643 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774347 4775 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774404 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774487 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774502 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774514 4775 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774525 4775 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774535 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.249679 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2tfh" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" containerID="cri-o://8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" gracePeriod=2
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerDied","Data":"40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8"}
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250093 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250300 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.876629 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.892409 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.899248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.901352 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994249 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") "
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994432 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") "
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") "
Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.995955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities" (OuterVolumeSpecName: "utilities") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.001218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj" (OuterVolumeSpecName: "kube-api-access-5cjdj") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "kube-api-access-5cjdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.096987 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.097021 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.147042 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.198080 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262729 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" exitCode=0
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"}
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262822 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab"}
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262862 4775 scope.go:117] "RemoveContainer" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.263179 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.285980 4775 scope.go:117] "RemoveContainer" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.308795 4775 scope.go:117] "RemoveContainer" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.340716 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.340912 4775 scope.go:117] "RemoveContainer" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"
Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.341367 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": container with ID starting with 8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67 not found: ID does not exist" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341406 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"} err="failed to get container status \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": rpc error: code = NotFound desc = could not find container \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": container with ID starting with 8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67 not found: ID does not exist"
Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341430 4775 scope.go:117] "RemoveContainer"
containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.341840 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": container with ID starting with 96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e not found: ID does not exist" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341879 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"} err="failed to get container status \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": rpc error: code = NotFound desc = could not find container \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": container with ID starting with 96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e not found: ID does not exist" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341899 4775 scope.go:117] "RemoveContainer" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.342245 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": container with ID starting with af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a not found: ID does not exist" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.342305 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a"} err="failed to get container status \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": rpc error: code = NotFound desc = could not find container \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": container with ID starting with af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a not found: ID does not exist" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.347557 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"] Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.482359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 11:36:42 crc kubenswrapper[4775]: W0127 11:36:42.486224 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2f2b115_8dea_4dfa_a28e_5322f8fb8274.slice/crio-34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b WatchSource:0}: Error finding container 34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b: Status 404 returned error can't find the container with id 34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.277651 4775 generic.go:334] "Generic (PLEG): container finished" podID="83263987-4e3c-4e95-9083-bb6a43f52410" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55" exitCode=0 Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.277721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"} Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.279662 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b"} Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.756423 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" path="/var/lib/kubelet/pods/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0/volumes" Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.964600 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:43 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:43 crc kubenswrapper[4775]: > Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.289121 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.289407 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.296522 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"2190ae12109e8e1dceb559f827413fd62ef1ea37bbce5e271b7ce01d48316f0c"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.296577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"53e60efb3d4da4f9c33a16ff79d7060d850dcc3d7dc90d35deb2f114cc11efec"} Jan 27 11:36:44 crc 
kubenswrapper[4775]: I0127 11:36:44.296590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"929dca143c02dd69cb1cad1d202c4addf9831873b6d8f82600a6d97b8e48ecc2"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.313460 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371970.54135 podStartE2EDuration="1m6.31342735s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:40.42103229 +0000 UTC m=+919.562630057" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:44.309750619 +0000 UTC m=+983.451348396" watchObservedRunningTime="2026-01-27 11:36:44.31342735 +0000 UTC m=+983.455025127" Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.949153 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:48 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:48 crc kubenswrapper[4775]: > Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.957700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.961323 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230046 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230874 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-utilities" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230893 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-utilities" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230913 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-content" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230919 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-content" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230931 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230938 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230950 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230956 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.231127 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.231139 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.234027 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.235974 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.243294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.328949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329260 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: 
\"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329477 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329558 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod 
\"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430728 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: 
\"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.431347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.432792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.451297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.549850 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.566708 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.921269 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.922923 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.957016 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.013809 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.014824 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.031239 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.032270 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.034851 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.044395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.044469 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.061961 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.080153 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.120412 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.122329 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.124488 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146652 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146911 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.147213 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.148023 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.176209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.232881 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.237628 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.245956 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246387 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248564 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc 
kubenswrapper[4775]: I0127 11:36:50.248667 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.249807 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.250282 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " 
pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.252545 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.264065 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.307407 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.314295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.336034 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.350721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.350928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod 
\"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.352640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.360813 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.369086 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.370278 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.380004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.399163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.439828 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.449073 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452923 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.455862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.456683 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.456878 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.458629 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.474132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.554861 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555290 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.570187 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.656658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.656713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.657577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.674300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 
11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.702248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.730257 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.836014 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.829888 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.831353 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgdvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-sd44h_openstack(ca5aab7c-3b7a-4996-82f5-478d4100bb6c): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.832769 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-sd44h" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" Jan 27 11:36:53 crc kubenswrapper[4775]: I0127 11:36:53.967792 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:53 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:53 crc kubenswrapper[4775]: > Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.190913 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d7f9ca_2e9c_4379_bee2_38cf61ed6cb2.slice/crio-f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827 WatchSource:0}: Error finding container f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827: Status 404 returned error can't find the container with id f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827 Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.191687 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.370151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerStarted","Data":"ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004"} Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.370191 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerStarted","Data":"f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827"} Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.374423 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"35b31caed1f9b488b656fc0047e0668077c6f405a3d048c168e07f94d8f89241"} Jan 27 11:36:54 crc kubenswrapper[4775]: E0127 11:36:54.377973 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f\\\"\"" pod="openstack/glance-db-sync-sd44h" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.400258 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-fcvx2" podStartSLOduration=4.400242807 podStartE2EDuration="4.400242807s" podCreationTimestamp="2026-01-27 11:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:54.398917471 +0000 UTC m=+993.540515248" watchObservedRunningTime="2026-01-27 11:36:54.400242807 +0000 UTC m=+993.541840584" Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.560405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.572132 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.582962 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a046ea_e8eb_40ed_a64d_b382e0a2f331.slice/crio-645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31 WatchSource:0}: Error finding container 645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31: Status 404 returned error can't find the container with id 645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31 Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.590432 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066d45f9_5f72_4b81_8166_0238863b8789.slice/crio-5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a WatchSource:0}: Error finding container 5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a: Status 404 returned error can't find the container with id 5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.594387 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.605511 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.614006 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.624575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.633918 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.634965 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc495d390_f7ca_4867_b334_263c03f6b211.slice/crio-2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669 WatchSource:0}: Error finding container 2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669: Status 404 returned error can't find the container with id 2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669 Jan 27 11:36:55 crc kubenswrapper[4775]: E0127 11:36:55.147049 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc495d390_f7ca_4867_b334_263c03f6b211.slice/crio-60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385255 4775 generic.go:334] "Generic (PLEG): container finished" podID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerID="8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385343 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerDied","Data":"8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerStarted","Data":"645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388675 4775 generic.go:334] "Generic (PLEG): container finished" podID="c495d390-f7ca-4867-b334-263c03f6b211" containerID="60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9" exitCode=0 Jan 27 
11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388809 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerDied","Data":"60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388831 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerStarted","Data":"2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.391701 4775 generic.go:334] "Generic (PLEG): container finished" podID="90455f95-bcc6-4229-948c-599c91a08b2a" containerID="ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.391965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerDied","Data":"ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.392039 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerStarted","Data":"2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.394090 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerStarted","Data":"11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.395412 4775 generic.go:334] "Generic (PLEG): container finished" podID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" 
containerID="ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.395509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerDied","Data":"ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398146 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerID="17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerDied","Data":"17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerStarted","Data":"0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408023 4775 generic.go:334] "Generic (PLEG): container finished" podID="066d45f9-5f72-4b81-8166-0238863b8789" containerID="7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408163 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerDied","Data":"7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408214 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerStarted","Data":"5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411617 4775 generic.go:334] "Generic (PLEG): container finished" podID="14af2799-fccb-4f03-99f2-356e53df0f68" containerID="ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerDied","Data":"ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerStarted","Data":"569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.426691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"98ff6ddda4279972141347d396b3579ff9da021538016933851421012d167dfd"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"88474d892484c7924d01a47d7c326f452626aedd6fa0336e92fb823f6345b6ff"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"87e5393a5725c003ea3f1e8b16b6a7ac22c48a6572934a4f3eec12a89ca32650"}
Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"4ee5eac6fa8d96f2c1251a9fa7a79066ea6f17f3f9fc3ba122ed91c65297289f"}
Jan 27 11:36:58 crc kubenswrapper[4775]: I0127 11:36:58.972015 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4hqln"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.221377 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.230994 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.258696 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.266579 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.278210 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.285243 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.289159 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335130 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"066d45f9-5f72-4b81-8166-0238863b8789\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335672 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"066d45f9-5f72-4b81-8166-0238863b8789\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335721 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335762 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58a046ea-e8eb-40ed-a64d-b382e0a2f331" (UID: "58a046ea-e8eb-40ed-a64d-b382e0a2f331"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336921 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run" (OuterVolumeSpecName: "var-run") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339466 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "066d45f9-5f72-4b81-8166-0238863b8789" (UID: "066d45f9-5f72-4b81-8166-0238863b8789"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339952 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.340124 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts" (OuterVolumeSpecName: "scripts") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm" (OuterVolumeSpecName: "kube-api-access-m7mfm") pod "58a046ea-e8eb-40ed-a64d-b382e0a2f331" (UID: "58a046ea-e8eb-40ed-a64d-b382e0a2f331"). InnerVolumeSpecName "kube-api-access-m7mfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s" (OuterVolumeSpecName: "kube-api-access-sdg2s") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "kube-api-access-sdg2s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346541 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh" (OuterVolumeSpecName: "kube-api-access-7xnvh") pod "066d45f9-5f72-4b81-8166-0238863b8789" (UID: "066d45f9-5f72-4b81-8166-0238863b8789"). InnerVolumeSpecName "kube-api-access-7xnvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437467 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"c495d390-f7ca-4867-b334-263c03f6b211\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437559 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"90455f95-bcc6-4229-948c-599c91a08b2a\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c495d390-f7ca-4867-b334-263c03f6b211" (UID: "c495d390-f7ca-4867-b334-263c03f6b211"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"c495d390-f7ca-4867-b334-263c03f6b211\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438215 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"90455f95-bcc6-4229-948c-599c91a08b2a\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438353 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438585 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" (UID: "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438633 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" (UID: "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90455f95-bcc6-4229-948c-599c91a08b2a" (UID: "90455f95-bcc6-4229-948c-599c91a08b2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") "
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439431 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439468 4775 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439480 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439491 4775 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439502 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439510 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439521 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439529 4775 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439537 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439546 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439554 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439563 4775 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439572 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439579 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.440659 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm" (OuterVolumeSpecName: "kube-api-access-2wmmm") pod "90455f95-bcc6-4229-948c-599c91a08b2a" (UID: "90455f95-bcc6-4229-948c-599c91a08b2a"). InnerVolumeSpecName "kube-api-access-2wmmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.441183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8" (OuterVolumeSpecName: "kube-api-access-z75v8") pod "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" (UID: "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2"). InnerVolumeSpecName "kube-api-access-z75v8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.441688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85" (OuterVolumeSpecName: "kube-api-access-x7v85") pod "c495d390-f7ca-4867-b334-263c03f6b211" (UID: "c495d390-f7ca-4867-b334-263c03f6b211"). InnerVolumeSpecName "kube-api-access-x7v85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.448100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj" (OuterVolumeSpecName: "kube-api-access-rjwkj") pod "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" (UID: "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07"). InnerVolumeSpecName "kube-api-access-rjwkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449051 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449065 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerDied","Data":"2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449099 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerDied","Data":"2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457278 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457422 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.460883 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerStarted","Data":"adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462685 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerDied","Data":"f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462756 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462820 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481053 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481397 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerDied","Data":"0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481481 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerDied","Data":"5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510251 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510301 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513616 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerDied","Data":"569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513661 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513721 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535692 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerDied","Data":"645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31"}
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535731 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535791 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6"
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541809 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541844 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541858 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541870 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.857630 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.125112 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kc6bw" podStartSLOduration=5.673631258 podStartE2EDuration="10.125093772s" podCreationTimestamp="2026-01-27 11:36:50 +0000 UTC" firstStartedPulling="2026-01-27 11:36:54.630664856 +0000 UTC m=+993.772262633" lastFinishedPulling="2026-01-27 11:36:59.08212737 +0000 UTC m=+998.223725147" observedRunningTime="2026-01-27 11:36:59.49196689 +0000 UTC m=+998.633564687" watchObservedRunningTime="2026-01-27 11:37:00.125093772 +0000 UTC m=+999.266691549"
Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.396163 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"]
Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.420751 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"]
Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.549627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"e07502f26c911250d5b66b3441141cc60742d2d1456821147604fadd202668c5"}
Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.549678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"da5d969dc14b57f0d19726f5eeac3e68a8a4451ad144cbddedd792bd17c2286f"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.560820 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"3d026310fb65835934c92d3f872a5ae72a03a0f21fd058e0432dfc895d788fea"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"be6a6987d8adde43d996c27c63c7030dd8388653333f1b5cb987b62e05e6df90"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"7fba88c01e837c306efea529653ad15f8560adfc196a2d66c87819e5f149678e"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"69422f80f6d8360d5ef3b9b108470f8bf2b7a2e3fd1188370efb001109582875"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"785fbff6082f4b621a054d1c6877ee393ea27dbf500b1f5df0edb5a7ed8cffb6"}
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.612114 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.931448964 podStartE2EDuration="37.612100201s" podCreationTimestamp="2026-01-27 11:36:24 +0000 UTC" firstStartedPulling="2026-01-27 11:36:42.48871662 +0000 UTC m=+981.630314397" lastFinishedPulling="2026-01-27 11:37:00.169367857 +0000 UTC m=+999.310965634" observedRunningTime="2026-01-27 11:37:01.6083854 +0000 UTC m=+1000.749983177" watchObservedRunningTime="2026-01-27 11:37:01.612100201 +0000 UTC m=+1000.753697978"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.756166 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" path="/var/lib/kubelet/pods/14af2799-fccb-4f03-99f2-356e53df0f68/volumes"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899462 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"]
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.899872 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899899 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.899932 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899945 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900153 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900165 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900184 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900194 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900223 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900233 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900243 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900263 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900273 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900501 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900534 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900549 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900559 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900574 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900590 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900603 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.901621 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.903528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.912314 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"]
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985518 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985710 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985849 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087687 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087850 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc
kubenswrapper[4775]: I0127 11:37:02.089089 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089210 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089274 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.104628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.226790 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.652497 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.583965 4775 generic.go:334] "Generic (PLEG): container finished" podID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerID="e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9" exitCode=0 Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.584025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9"} Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.584414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerStarted","Data":"f0234531f1a183116689c44e0aec6118543c01688ecd27a430619636f971edf6"} Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.586351 4775 generic.go:334] "Generic (PLEG): container finished" podID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerID="adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42" exitCode=0 Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.586384 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" 
event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerDied","Data":"adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42"} Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.599917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerStarted","Data":"0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d"} Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.625515 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" podStartSLOduration=3.6254439979999997 podStartE2EDuration="3.625443998s" podCreationTimestamp="2026-01-27 11:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:04.619976468 +0000 UTC m=+1003.761574285" watchObservedRunningTime="2026-01-27 11:37:04.625443998 +0000 UTC m=+1003.767041855" Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.901576 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042432 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.047649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq" (OuterVolumeSpecName: "kube-api-access-777qq") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "kube-api-access-777qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.065777 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.115809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data" (OuterVolumeSpecName: "config-data") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.144792 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.145156 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.145429 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerDied","Data":"11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e"} Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611499 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611555 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 
27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611390 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.876432 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:05 crc kubenswrapper[4775]: E0127 11:37:05.876832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.876851 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.877097 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.877705 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.884210 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.886039 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.886050 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.887496 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.894704 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.912074 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.931702 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959844 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959913 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc 
kubenswrapper[4775]: I0127 11:37:05.959952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.960002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.960031 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.031991 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.033275 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.060937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061002 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061099 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc 
kubenswrapper[4775]: I0127 11:37:06.061132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.083005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.091778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.092354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.092494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.094242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 
11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.128182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.139058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: 
I0127 11:37:06.162465 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.195801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.220537 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.221754 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.235680 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rrpqs" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.235943 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.236499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.236639 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.264380 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279253 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: 
\"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279337 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279392 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: 
\"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.280137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.284949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " 
pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.285504 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.285941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.286427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.328315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.349392 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.369964 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382060 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382198 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod 
\"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.385944 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.386058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.386777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.387036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.387247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.392392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.392607 4775 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.413766 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.414694 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.420959 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.421138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.425937 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dtgzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.428731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.452365 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.453503 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.461709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4b27z" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.461964 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.478959 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") 
pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: 
\"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484421 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484470 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 
11:37:06.484542 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.485518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.496362 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.514332 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.516058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.532505 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.550734 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.583829 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.591059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.591226 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593565 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " 
pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.595057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.595171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 
11:37:06.613025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.615522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.615589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.618353 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.622254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.624658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: 
\"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.624718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.634584 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.636546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.640725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.641054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.641964 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.643736 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.646294 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.648236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.649251 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.650962 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.654161 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.654574 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rl8gp" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.681546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.689051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.693622 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"] Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.695958 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: 
\"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " 
pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.700243 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.704725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.704767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.707976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.710044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-99pzl"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.715993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.716340 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.720025 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.728712 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-74wvb"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.730192 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.732397 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.732768 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7dndl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.733042 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.750051 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-74wvb"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.762781 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.791365 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800382 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.804071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.806205 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.818408 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.854374 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902123 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902196 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902271 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902388 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902409 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903702 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.904209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.912749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.916183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.923529 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.929511 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.930849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.988290 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.019126 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.065070 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.088420 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.104339 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.331215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.368559 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xbnrk"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.663126 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2nfbz"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.665413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerStarted","Data":"4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9"}
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.667592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"e85e8e0f44ac4f6cbdc0a4bbf06db8528c1a4b4037fff448eea7f2f74eae3616"}
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.668628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerStarted","Data":"3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0"}
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.669467 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" containerID="cri-o://0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" gracePeriod=10
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.669683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerStarted","Data":"0ec26ab5f4eca94fe87dc449d18ade5b38eb6a8d9143ab8a2b319169b716216e"}
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.684308 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.734801 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"]
Jan 27 11:37:07 crc kubenswrapper[4775]: W0127 11:37:07.747136 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73cda8b_d244_4ad1_8f54_f5680565327d.slice/crio-c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f WatchSource:0}: Error finding container c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f: Status 404 returned error can't find the container with id c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.772870 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-99pzl"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.791383 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.954143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-74wvb"]
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.143174 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"]
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.161424 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.168888 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"]
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.178154 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.229060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"]
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340608 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340651 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340680 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.341411 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.341853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.342889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.356635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.417312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.538137 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.677355 4775 generic.go:334] "Generic (PLEG): container finished" podID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerID="0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" exitCode=0
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.677428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d"}
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.679101 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerStarted","Data":"9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd"}
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.680748 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerStarted","Data":"86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6"}
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.681674 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"a1da85b3df4788f571e86de3391158e11cf2502b74702f3be38ea8d5b9dea0f2"}
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.683467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f"}
Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.684994 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerStarted","Data":"a084fdcf587e56beb20131ec45402a052d2991a945867b6c6e9adfa05c842c39"}
Jan 27 11:37:10 crc kubenswrapper[4775]: W0127 11:37:10.093789 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c313125_cfde_424b_9bb3_acb232d20ba3.slice/crio-b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1 WatchSource:0}: Error finding container b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1: Status 404 returned error can't find the container with id b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1
Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.600937 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"]
Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.699510 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerStarted","Data":"b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1"}
Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.700636 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"96c45e8e9930bf07afed2f11987b0afd9b083256c7c2af2e8c36913249d87fa8"}
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.453987 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.598830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.598931 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599131 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") "
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.617905 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz" (OuterVolumeSpecName: "kube-api-access-6wtgz") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "kube-api-access-6wtgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.684626 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.702210 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.702247 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.707984 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.710199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.723290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "dns-swift-storage-0".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"f0234531f1a183116689c44e0aec6118543c01688ecd27a430619636f971edf6"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732375 4775 scope.go:117] "RemoveContainer" containerID="0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732525 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config" (OuterVolumeSpecName: "config") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerStarted","Data":"4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771729 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerStarted","Data":"99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerStarted","Data":"9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.780377 4775 scope.go:117] "RemoveContainer" containerID="e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.797487 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d2d75af-356d-4928-82f7-3555df136fac" containerID="546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3" exitCode=0 Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.800033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerDied","Data":"546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.815294 4775 generic.go:334] "Generic (PLEG): container finished" podID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerID="de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd" exitCode=0 Jan 27 
11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.815665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833537 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833576 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833586 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833600 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.064549 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4jgdk" podStartSLOduration=7.064531024 podStartE2EDuration="7.064531024s" podCreationTimestamp="2026-01-27 11:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.036233587 +0000 UTC m=+1011.177831364" watchObservedRunningTime="2026-01-27 11:37:12.064531024 +0000 UTC m=+1011.206128801" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.076385 
4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sd44h" podStartSLOduration=6.726539738 podStartE2EDuration="34.06843417s" podCreationTimestamp="2026-01-27 11:36:38 +0000 UTC" firstStartedPulling="2026-01-27 11:36:38.992139762 +0000 UTC m=+978.133737539" lastFinishedPulling="2026-01-27 11:37:06.334034204 +0000 UTC m=+1005.475631971" observedRunningTime="2026-01-27 11:37:12.064085341 +0000 UTC m=+1011.205683118" watchObservedRunningTime="2026-01-27 11:37:12.06843417 +0000 UTC m=+1011.210031947" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.090062 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-99pzl" podStartSLOduration=6.090045984 podStartE2EDuration="6.090045984s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.081120198 +0000 UTC m=+1011.222717985" watchObservedRunningTime="2026-01-27 11:37:12.090045984 +0000 UTC m=+1011.231643761" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.176543 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.186731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.187845 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.249517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.249965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250017 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250076 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250121 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j59w\" 
(UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.265129 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w" (OuterVolumeSpecName: "kube-api-access-9j59w") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "kube-api-access-9j59w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.273405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.287866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.306251 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.306319 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config" (OuterVolumeSpecName: "config") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.307778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352600 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352630 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352659 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352670 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 
crc kubenswrapper[4775]: I0127 11:37:12.352681 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352691 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.849943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerStarted","Data":"d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1"} Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.850142 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerDied","Data":"0ec26ab5f4eca94fe87dc449d18ade5b38eb6a8d9143ab8a2b319169b716216e"} Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856887 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856890 4775 scope.go:117] "RemoveContainer" containerID="546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.877023 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" podStartSLOduration=6.876998855 podStartE2EDuration="6.876998855s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.867976087 +0000 UTC m=+1012.009573854" watchObservedRunningTime="2026-01-27 11:37:12.876998855 +0000 UTC m=+1012.018596632" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.942984 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.944419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:13 crc kubenswrapper[4775]: I0127 11:37:13.803481 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2d75af-356d-4928-82f7-3555df136fac" path="/var/lib/kubelet/pods/3d2d75af-356d-4928-82f7-3555df136fac/volumes" Jan 27 11:37:13 crc kubenswrapper[4775]: I0127 11:37:13.805055 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" path="/var/lib/kubelet/pods/77bba6d5-b2fc-4cb1-a104-f61fb146ae66/volumes" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.898753 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.928633 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:14 crc 
kubenswrapper[4775]: E0127 11:37:14.929251 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929268 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: E0127 11:37:14.929283 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929291 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: E0127 11:37:14.929319 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929324 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929521 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.930766 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.931742 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.937128 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.945663 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.979256 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002745 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002993 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.003064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.018444 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.021123 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.035064 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104486 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.105155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" 
(UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.106000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.108741 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc 
kubenswrapper[4775]: I0127 11:37:15.109758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.114925 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.126862 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208433 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208474 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208541 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208587 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208638 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.221911 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.252275 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309624 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " 
pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309812 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310686 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.319901 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.330105 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.332610 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.333554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.346341 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.944009 4775 generic.go:334] "Generic (PLEG): container finished" podID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerID="4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a" exitCode=0 Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.944324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerDied","Data":"4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a"} Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.022247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.087577 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.087787 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" containerID="cri-o://5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34" gracePeriod=10 Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.969712 4775 generic.go:334] "Generic (PLEG): container finished" podID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerID="5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34" exitCode=0 Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.969751 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34"} Jan 27 11:37:20 crc kubenswrapper[4775]: I0127 11:37:20.133639 4775 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 27 11:37:25 crc kubenswrapper[4775]: I0127 11:37:25.133439 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.808270 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.808483 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lmq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-74wvb_openstack(5c313125-cfde-424b-9bb3-acb232d20ba3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.809667 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-74wvb" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.050505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-74wvb" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.195966 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.196138 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h58dh59bh7bh68dhc8h5b9h569h564h75h75hb7hd6h597h675hb9h556h649h689h54dh554h8dh7bh9h8dh5f7h5b4h7bh5fdh8dh94h59fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5n55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8a82d041-4b07-491a-8af6-232e67a23299): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.350603 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482969 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482991 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483067 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.489661 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.491313 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.494274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq" (OuterVolumeSpecName: "kube-api-access-7wwxq") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "kube-api-access-7wwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.509268 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts" (OuterVolumeSpecName: "scripts") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.510570 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.514217 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data" (OuterVolumeSpecName: "config-data") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586028 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586181 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586273 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586357 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586432 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586567 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerDied","Data":"4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9"} Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056492 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056550 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.059143 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerID="9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e" exitCode=0 Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.059175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerDied","Data":"9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e"} Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.442345 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.451080 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:28 crc kubenswrapper[4775]: E0127 11:37:28.535679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535696 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535864 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.536463 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540104 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540389 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540587 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540902 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.541094 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.548837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.605979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606059 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.707992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735308 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735358 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735433 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735444 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.737984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.853156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:29 crc kubenswrapper[4775]: I0127 11:37:29.756478 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" path="/var/lib/kubelet/pods/e6fc373f-0642-464e-81c9-b78a27dfebbe/volumes"
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.109648 4775 generic.go:334] "Generic (PLEG): container finished" podID="73aaf8f0-0380-4eff-875b-90da115dba37" containerID="99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf" exitCode=0
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.109752 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerDied","Data":"99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf"}
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.133502 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout"
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.133790 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.366186 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16"
Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.366351 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftqgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-2nfbz_openstack(0edaeaa2-aa90-484f-854c-db5dd181f61b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.367574 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-2nfbz" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b"
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.482984 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.491201 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h"
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.562326 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563684 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563852 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563875 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") "
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.567999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh" (OuterVolumeSpecName: "kube-api-access-sgdvh") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "kube-api-access-sgdvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.568362 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.569360 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd" (OuterVolumeSpecName: "kube-api-access-g82sd") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "kube-api-access-g82sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.608267 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.613015 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.617168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config" (OuterVolumeSpecName: "config") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.622942 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.623397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data" (OuterVolumeSpecName: "config-data") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.625886 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666151 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666188 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666202 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666216 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666228 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666238 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666249 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666268 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666280 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094"}
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119323 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119347 4775 scope.go:117] "RemoveContainer" containerID="5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34"
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerDied","Data":"6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6"}
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125070 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6"
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125119 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h"
Jan 27 11:37:36 crc kubenswrapper[4775]: E0127 11:37:36.127229 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-2nfbz" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b"
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.160474 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"]
Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.171276 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"]
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.704888 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.705364 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h7xk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xbnrk_openstack(2029cc7b-c115-4c17-8713-c6eed291e963): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.706500 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xbnrk" podUID="2029cc7b-c115-4c17-8713-c6eed291e963"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.812662 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.904942 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") "
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.905168 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") "
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.905243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") "
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.942111 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r" (OuterVolumeSpecName: "kube-api-access-kvd5r") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "kube-api-access-kvd5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.951201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config" (OuterVolumeSpecName: "config") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.981294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012847 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012873 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012883 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085179 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"]
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085761 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085792 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="init"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085797 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="init"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085816 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085823 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085838 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086038 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086052 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086060 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.087042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.103149 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"]
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.166282 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.167046 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerDied","Data":"86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6"}
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.167077 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6"
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.169268 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-xbnrk" podUID="2029cc7b-c115-4c17-8713-c6eed291e963"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.218797 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219120 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320557 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320676 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.323193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.324894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325850 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325998 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.339193 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"]
Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.339809 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zz6sr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" podUID="957bd5b8-fe11-4f5e-b796-91f1ab9450c2"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.354130 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"]
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.356054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.356381 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.380756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"]
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.469997 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.487954 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.493740 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499077 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499331 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rl8gp" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499665 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.516438 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: 
\"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525371 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526570 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc 
kubenswrapper[4775]: I0127 11:37:37.526111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.527083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.548700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630123 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630171 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630493 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.694997 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735494 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc 
kubenswrapper[4775]: I0127 11:37:37.739938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.739987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.750332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.750366 4775 scope.go:117] "RemoveContainer" containerID="b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.758470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.759019 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " 
pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.769903 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" path="/var/lib/kubelet/pods/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8/volumes" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.814157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.961045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.964703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.968870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.969830 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ghp7c" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.970091 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.994782 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046192 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046326 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046443 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149722 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149880 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjg8w\" 
(UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.150019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.150061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153612 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153641 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153885 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.156787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.160145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.166148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.170524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.179176 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.181561 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.184188 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.191527 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.198089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"67af1fcb0bcad60b4d6220dc2a58636c77413c902e1d5d58f9a296545b8c138a"} Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.209223 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.233215 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.236700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.236607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251895 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251991 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252131 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252200 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.313927 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.322756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:38 crc kubenswrapper[4775]: W0127 11:37:38.344414 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba461ef4_49c1_4edc_ac60_1dfb91642c46.slice/crio-0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843 WatchSource:0}: Error finding container 0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843: Status 404 returned error can't find the container with id 0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843 Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config" (OuterVolumeSpecName: "config") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: 
I0127 11:37:38.356830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356937 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356947 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356958 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356966 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356973 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.357067 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.357771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.358649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.376933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.385759 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr" (OuterVolumeSpecName: "kube-api-access-zz6sr") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). 
InnerVolumeSpecName "kube-api-access-zz6sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.390375 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.391278 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.394109 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.465239 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.482671 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.593299 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.625710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.830606 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.048112 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:39 crc kubenswrapper[4775]: W0127 11:37:39.072640 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c65a1a_cace_450e_bd9d_b2f6824e6add.slice/crio-a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157 WatchSource:0}: Error finding container a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157: Status 404 returned error can't find the container with id a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.225785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerStarted","Data":"0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.237069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.253204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.255967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerStarted","Data":"f613d08fcd685ed44899c259a171ad733b3147458ae9f365bbc1e423524fcf00"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.261701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.262054 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cf66fb49-4l4kc" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" containerID="cri-o://5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.262099 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cf66fb49-4l4kc" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" containerID="cri-o://27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.264847 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"c1fec8af272f03a1a22bffcede83ebeeb34fe78f0c6c8f7e8812b5c385fd5e75"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.264878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" 
event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"1605ce07b00108a41ae30e35ee9b929a02da1e4e7749dbe81b9666e441c5dc91"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.268841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.268979 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c78fd876f-8p4lr" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" containerID="cri-o://5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.269072 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c78fd876f-8p4lr" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" containerID="cri-o://2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.279204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.279243 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"98a20e3bbe057f1a1083416d0cff14282fdc9e2fca7261f4540fdf9a82145994"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.280544 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" 
event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.284254 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.284518 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.290804 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58cf66fb49-4l4kc" podStartSLOduration=3.2905261120000002 podStartE2EDuration="33.290790994s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.751026363 +0000 UTC m=+1006.892624150" lastFinishedPulling="2026-01-27 11:37:37.751291255 +0000 UTC m=+1036.892889032" observedRunningTime="2026-01-27 11:37:39.290353092 +0000 UTC m=+1038.431950859" watchObservedRunningTime="2026-01-27 11:37:39.290790994 +0000 UTC m=+1038.432388771" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.333791 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84666cddfd-6l8vq" podStartSLOduration=25.333774803 podStartE2EDuration="25.333774803s" podCreationTimestamp="2026-01-27 11:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:39.321121686 +0000 UTC m=+1038.462719473" watchObservedRunningTime="2026-01-27 11:37:39.333774803 +0000 UTC m=+1038.475372580" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.377219 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-7c78fd876f-8p4lr" podStartSLOduration=4.092796103 podStartE2EDuration="33.377192873s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.368950875 +0000 UTC m=+1006.510548642" lastFinishedPulling="2026-01-27 11:37:36.653347635 +0000 UTC m=+1035.794945412" observedRunningTime="2026-01-27 11:37:39.372817333 +0000 UTC m=+1038.514415130" watchObservedRunningTime="2026-01-27 11:37:39.377192873 +0000 UTC m=+1038.518790650" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.517262 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.547876 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.638218 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.776556 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957bd5b8-fe11-4f5e-b796-91f1ab9450c2" path="/var/lib/kubelet/pods/957bd5b8-fe11-4f5e-b796-91f1ab9450c2/volumes" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.134187 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.307338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerStarted","Data":"d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.318633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"4bdedb28b55f515c534118238ae1a6d785ae6ac96c7d8f97a7f78f628958a487"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.324132 4775 generic.go:334] "Generic (PLEG): container finished" podID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerID="93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168" exitCode=0 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.324403 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.342145 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.364328 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gcjrx" podStartSLOduration=12.364312964 podStartE2EDuration="12.364312964s" podCreationTimestamp="2026-01-27 11:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.330583389 +0000 UTC m=+1039.472181166" watchObservedRunningTime="2026-01-27 11:37:40.364312964 +0000 UTC m=+1039.505910741" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.364794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d"} Jan 27 11:37:40 crc kubenswrapper[4775]: 
I0127 11:37:40.375897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.391555 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.392063 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.406291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.419099 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66f4cff584-s28fg" podStartSLOduration=3.419075425 podStartE2EDuration="3.419075425s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.411094646 +0000 UTC m=+1039.552692433" watchObservedRunningTime="2026-01-27 11:37:40.419075425 +0000 UTC m=+1039.560673202" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.427668 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"9bee1aab4d317b8a9716f1db1b63ae74d2cad3853b202e1ae2748039624764ee"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 
11:37:40.435660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.435805 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6cd994f7-2jm86" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" containerID="cri-o://d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2" gracePeriod=30 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.436044 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6cd994f7-2jm86" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" containerID="cri-o://69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99" gracePeriod=30 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.468441 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6546ffcc78-4zdnk" podStartSLOduration=26.468421738 podStartE2EDuration="26.468421738s" podCreationTimestamp="2026-01-27 11:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.461662733 +0000 UTC m=+1039.603260520" watchObservedRunningTime="2026-01-27 11:37:40.468421738 +0000 UTC m=+1039.610019515" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.507886 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f6cd994f7-2jm86" podStartSLOduration=5.2342435720000005 podStartE2EDuration="32.50786139s" podCreationTimestamp="2026-01-27 11:37:08 +0000 UTC" firstStartedPulling="2026-01-27 11:37:10.610110239 +0000 UTC m=+1009.751708016" lastFinishedPulling="2026-01-27 11:37:37.883728057 +0000 UTC 
m=+1037.025325834" observedRunningTime="2026-01-27 11:37:40.489620629 +0000 UTC m=+1039.631218426" watchObservedRunningTime="2026-01-27 11:37:40.50786139 +0000 UTC m=+1039.649459167" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.946562 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.035550 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.445556 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1"} Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.445724 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" containerID="cri-o://f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" gracePeriod=30 Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.445704 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log" containerID="cri-o://fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" gracePeriod=30 Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.459989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerStarted","Data":"a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a"} Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.460142 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.466291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6"} Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.479173 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.479157356 podStartE2EDuration="5.479157356s" podCreationTimestamp="2026-01-27 11:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:41.471900367 +0000 UTC m=+1040.613498164" watchObservedRunningTime="2026-01-27 11:37:41.479157356 +0000 UTC m=+1040.620755133" Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.509074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" podStartSLOduration=4.509054456 podStartE2EDuration="4.509054456s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:41.501764696 +0000 UTC m=+1040.643362493" watchObservedRunningTime="2026-01-27 11:37:41.509054456 +0000 UTC m=+1040.650652233" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478525 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" containerID="cri-o://3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6" gracePeriod=30 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478551 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd" containerID="cri-o://89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd" gracePeriod=30 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481184 4775 generic.go:334] "Generic (PLEG): container finished" podID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerID="f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" exitCode=0 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481226 4775 generic.go:334] "Generic (PLEG): container finished" podID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerID="fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" exitCode=143 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481598 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.504793 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.504774922 podStartE2EDuration="5.504774922s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:42.504108564 +0000 UTC m=+1041.645706361" watchObservedRunningTime="2026-01-27 11:37:42.504774922 +0000 UTC m=+1041.646372699" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.902473 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.904135 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.914735 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.915025 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.922058 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000763 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000833 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000989 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.001036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103068 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103154 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103238 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103388 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.109814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.120035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.123871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.126945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: 
\"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.127289 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.136048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.138569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.289961 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.535842 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610097 4775 generic.go:334] "Generic (PLEG): container finished" podID="722b4859-0679-4bb0-98eb-c4168101124e" containerID="89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd" exitCode=0 Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610420 4775 generic.go:334] "Generic (PLEG): container finished" podID="722b4859-0679-4bb0-98eb-c4168101124e" containerID="3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6" exitCode=143 Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616057 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616096 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616112 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616234 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.619895 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs" (OuterVolumeSpecName: "logs") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.624281 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts" (OuterVolumeSpecName: "scripts") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.625528 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.629609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w" (OuterVolumeSpecName: "kube-api-access-kjg8w") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "kube-api-access-kjg8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638221 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638341 4775 scope.go:117] "RemoveContainer" containerID="f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.717778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.718668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: W0127 11:37:43.718834 4775 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/18c65a1a-cace-450e-bd9d-b2f6824e6add/volumes/kubernetes.io~secret/combined-ca-bundle Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.718853 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.720752 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721046 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721076 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721088 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721210 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721236 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.742648 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data" (OuterVolumeSpecName: "config-data") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.745732 4775 scope.go:117] "RemoveContainer" containerID="fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.752582 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.770345 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.822761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823233 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod 
\"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823402 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824317 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824342 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.826970 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs" (OuterVolumeSpecName: "logs") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.829718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6" (OuterVolumeSpecName: "kube-api-access-xcgj6") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "kube-api-access-xcgj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.833761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts" (OuterVolumeSpecName: "scripts") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.837947 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.860650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.888811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data" (OuterVolumeSpecName: "config-data") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933113 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933149 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933181 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933191 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") on node \"crc\" DevicePath \"\"" 
Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933199 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933208 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933215 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.980607 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.993462 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.995966 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.032773 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033268 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033287 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033321 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033331 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033349 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033357 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033386 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033397 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033687 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033720 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033741 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033757 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.034931 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.035091 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040287 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040504 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040673 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.132941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: 
\"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136904 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136938 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136972 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.137008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242562 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.243836 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.244035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod 
\"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.249807 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.257292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.257746 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.267004 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.267868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " 
pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.290285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.306413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.352675 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.672774 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerStarted","Data":"398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912"} Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.687882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d"} Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.687937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"129e86fff0154f3e4de3082e715fe1284c270556711420ae01c9066fffafb3c8"} Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700750 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"4bdedb28b55f515c534118238ae1a6d785ae6ac96c7d8f97a7f78f628958a487"} Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700809 4775 scope.go:117] "RemoveContainer" containerID="89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700934 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.771800 4775 scope.go:117] "RemoveContainer" containerID="3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.784074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-74wvb" podStartSLOduration=5.706032541 podStartE2EDuration="38.784039747s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:10.095633519 +0000 UTC m=+1009.237231296" lastFinishedPulling="2026-01-27 11:37:43.173640725 +0000 UTC m=+1042.315238502" observedRunningTime="2026-01-27 11:37:44.699581762 +0000 UTC m=+1043.841179559" watchObservedRunningTime="2026-01-27 11:37:44.784039747 +0000 UTC m=+1043.925637514" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.797560 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.827859 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.854320 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.856251 4775 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.862032 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.862512 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.867242 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964504 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964607 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964648 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.016397 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.066677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067126 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067160 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 
11:37:45.067195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067307 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.068906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.068992 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.076307 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.085373 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.086292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.093032 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.120024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wrh\" (UniqueName: 
\"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.161200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.181783 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.253814 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.253906 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.346548 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.346909 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.584555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.711291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"815ca40b27fb4cea044b33dd23bf33c1b082f912269530f93879da29eb229030"} Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 
11:37:45.714202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe"} Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.714338 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.717903 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerID="d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32" exitCode=0 Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.717949 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerDied","Data":"d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32"} Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.723820 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"4be346d9744f80cbe9acdb090392b9c63c5e0cb6ed893fe6b3ae4a4e7c97ad5e"} Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.734490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55b847b569-ccplz" podStartSLOduration=3.734474472 podStartE2EDuration="3.734474472s" podCreationTimestamp="2026-01-27 11:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:45.73185288 +0000 UTC m=+1044.873450667" watchObservedRunningTime="2026-01-27 11:37:45.734474472 +0000 UTC m=+1044.876072249" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.756356 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" path="/var/lib/kubelet/pods/18c65a1a-cace-450e-bd9d-b2f6824e6add/volumes" Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.762282 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722b4859-0679-4bb0-98eb-c4168101124e" path="/var/lib/kubelet/pods/722b4859-0679-4bb0-98eb-c4168101124e/volumes" Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.584997 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.745261 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerID="398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912" exitCode=0 Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.745348 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerDied","Data":"398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912"} Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.749972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"} Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.855202 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 11:37:47.696651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 11:37:47.795263 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"] Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 
11:37:47.795645 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns" containerID="cri-o://d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1" gracePeriod=10 Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.538530 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.819143 4775 generic.go:334] "Generic (PLEG): container finished" podID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerID="d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1" exitCode=0 Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.819188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1"} Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.830606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerDied","Data":"0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843"} Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.830880 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843" Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.834599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerDied","Data":"b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1"} Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.834631 4775 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1" Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.922464 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.959675 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984260 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984501 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.003197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm" (OuterVolumeSpecName: "kube-api-access-wl8lm") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "kube-api-access-wl8lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.012175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.014597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.025700 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts" (OuterVolumeSpecName: "scripts") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.039241 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data" (OuterVolumeSpecName: "config-data") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.039340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086663 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086738 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086939 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087287 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087299 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087309 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087317 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087325 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087333 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.090335 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs" (OuterVolumeSpecName: "logs") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.091513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8" (OuterVolumeSpecName: "kube-api-access-9lmq8") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "kube-api-access-9lmq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.096504 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts" (OuterVolumeSpecName: "scripts") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.122579 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.123927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data" (OuterVolumeSpecName: "config-data") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192194 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192570 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192583 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192593 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192624 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.421566 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497301 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497381 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497645 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.514211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt" (OuterVolumeSpecName: "kube-api-access-524jt") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "kube-api-access-524jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.593824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config" (OuterVolumeSpecName: "config") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.600893 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.600921 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.602278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.602913 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.607225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.607344 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702214 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702583 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702678 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702713 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845874 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"a084fdcf587e56beb20131ec45402a052d2991a945867b6c6e9adfa05c842c39"} Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.849535 4775 scope.go:117] "RemoveContainer" containerID="d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845925 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845970 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.892508 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"] Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.899612 4775 scope.go:117] "RemoveContainer" containerID="de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd" Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.899819 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.114870 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5994598694-dhq5v"] Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115616 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115635 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync" Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115661 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="init" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115668 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="init" Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115676 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115683 4775 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap" Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115702 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115709 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115902 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115947 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115959 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.116593 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122280 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122548 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122685 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122970 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.139897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.140240 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.153701 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5994598694-dhq5v"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219098 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " 
pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219281 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: 
\"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.263932 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.266504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.276656 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7dndl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.277596 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279218 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279470 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279545 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.281418 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321774 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321981 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322053 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322119 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322249 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322418 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.335259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.335791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.337127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.339801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.341564 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.342126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: 
\"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.344677 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.362014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.423988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424060 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 
11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.429187 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.429809 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.433175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.438620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.440197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.441277 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.465202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspwx\" (UniqueName: 
\"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.547052 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.643899 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.812044 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" path="/var/lib/kubelet/pods/0276dc98-8972-465b-bf5a-e222c73eb8a0/volumes" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.901353 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5994598694-dhq5v"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.904260 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.906532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.934593 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.968911 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.968888902 podStartE2EDuration="8.968888902s" podCreationTimestamp="2026-01-27 11:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:51.958095525 +0000 UTC m=+1051.099693312" watchObservedRunningTime="2026-01-27 11:37:51.968888902 +0000 UTC m=+1051.110486679" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.271854 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:37:52 crc kubenswrapper[4775]: W0127 11:37:52.282953 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod926c665f_b922_4372_85aa_bbe29399eaac.slice/crio-0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7 WatchSource:0}: Error finding container 0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7: Status 404 returned error can't find the container with id 0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7 Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" 
event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968627 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.970929 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerStarted","Data":"1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.979108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.005284 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b9b59fc66-t6rbl" podStartSLOduration=2.005263183 podStartE2EDuration="2.005263183s" podCreationTimestamp="2026-01-27 11:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:52.994878548 +0000 UTC m=+1052.136476325" watchObservedRunningTime="2026-01-27 11:37:53.005263183 +0000 UTC m=+1052.146860960" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.012629 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" 
event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerStarted","Data":"41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.022811 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2nfbz" podStartSLOduration=2.449535328 podStartE2EDuration="47.022793214s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.688422935 +0000 UTC m=+1006.830020712" lastFinishedPulling="2026-01-27 11:37:52.261680821 +0000 UTC m=+1051.403278598" observedRunningTime="2026-01-27 11:37:53.014775664 +0000 UTC m=+1052.156373431" watchObservedRunningTime="2026-01-27 11:37:53.022793214 +0000 UTC m=+1052.164390991" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5994598694-dhq5v" event={"ID":"94f53f42-a5fc-45f9-b94c-4f12b63d8d75","Type":"ContainerStarted","Data":"f4e267e8e6c46c7ea5135342bbd56e5f4d8c0dc885b0f451523f5713bcaf56fb"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5994598694-dhq5v" event={"ID":"94f53f42-a5fc-45f9-b94c-4f12b63d8d75","Type":"ContainerStarted","Data":"7bf24c9591bfe20f8b8f6d29ed33805940d457f0f72bd837d83bf7d002869247"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029571 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.055752 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.055733077 podStartE2EDuration="9.055733077s" podCreationTimestamp="2026-01-27 11:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 11:37:53.039240534 +0000 UTC m=+1052.180838331" watchObservedRunningTime="2026-01-27 11:37:53.055733077 +0000 UTC m=+1052.197330854" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.070383 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xbnrk" podStartSLOduration=3.110992017 podStartE2EDuration="47.070365787s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.383117263 +0000 UTC m=+1006.524715040" lastFinishedPulling="2026-01-27 11:37:51.342491033 +0000 UTC m=+1050.484088810" observedRunningTime="2026-01-27 11:37:53.063837799 +0000 UTC m=+1052.205435576" watchObservedRunningTime="2026-01-27 11:37:53.070365787 +0000 UTC m=+1052.211963554" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.353510 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.353597 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.397680 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.400345 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.429220 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5994598694-dhq5v" podStartSLOduration=3.429195592 podStartE2EDuration="3.429195592s" podCreationTimestamp="2026-01-27 11:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:53.086564082 +0000 UTC 
m=+1052.228161859" watchObservedRunningTime="2026-01-27 11:37:54.429195592 +0000 UTC m=+1053.570793369" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.045373 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.045410 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.182135 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.182509 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.221081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.221898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.255589 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.349039 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6546ffcc78-4zdnk" podUID="00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 27 11:37:56 crc 
kubenswrapper[4775]: I0127 11:37:56.058662 4775 generic.go:334] "Generic (PLEG): container finished" podID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerID="1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c" exitCode=0 Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.058773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerDied","Data":"1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c"} Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.060331 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.060363 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.029396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.070630 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.186877 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:37:58 crc kubenswrapper[4775]: I0127 11:37:58.086747 4775 generic.go:334] "Generic (PLEG): container finished" podID="2029cc7b-c115-4c17-8713-c6eed291e963" containerID="41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017" exitCode=0 Jan 27 11:37:58 crc kubenswrapper[4775]: I0127 11:37:58.086869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerDied","Data":"41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017"} Jan 27 11:37:58 crc 
kubenswrapper[4775]: I0127 11:37:58.877126 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.764494 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.775267 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.828303 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp" (OuterVolumeSpecName: "kube-api-access-ftqgp") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). 
InnerVolumeSpecName "kube-api-access-ftqgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.828487 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.863145 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907896 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod 
\"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907927 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908231 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908501 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908513 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908521 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908551 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.911920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts" (OuterVolumeSpecName: "scripts") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.913027 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.915176 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8" (OuterVolumeSpecName: "kube-api-access-h7xk8") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "kube-api-access-h7xk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.932033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.962656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data" (OuterVolumeSpecName: "config-data") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010878 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010912 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010923 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010931 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010939 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.059051 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerDied","Data":"9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd"} Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144119 4775 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144167 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.158769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerDied","Data":"3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0"} Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.159139 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.159207 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567257 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: E0127 11:38:00.567865 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567877 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: E0127 11:38:00.567891 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567897 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.568034 
4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.568049 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.569304 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.572682 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dtgzl" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.573092 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.573265 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.578507 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.608134 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") 
pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623577 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623596 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.663753 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.677023 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 
11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.677136 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724795 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") 
pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724821 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724993 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.730137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.743485 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.751697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.752591 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.752978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.773262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.773346 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.777523 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.782046 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.790241 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831222 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831288 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831389 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831430 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831529 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831610 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.832601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: 
\"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833578 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.834101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.853267 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.900325 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.910629 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932895 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933054 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933069 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933385 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.938797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.941062 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.942099 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.954893 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.961322 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.014533 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.016142 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.023698 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.027704 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4b27z" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.029849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.087581 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.123625 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.123777 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137129 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137480 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137514 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137660 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.156633 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.219845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.236416 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.240969 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241023 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") 
pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241107 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241155 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.244516 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.245894 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.255697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.263832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.271552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.272841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.276780 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.279109 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:01 crc 
kubenswrapper[4775]: I0127 11:38:01.280983 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.299629 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.304618 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345151 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345349 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345682 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347819 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.349926 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.353887 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.365332 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.368418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.377789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.378174 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " 
pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450482 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: 
I0127 11:38:01.450507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.451966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.452010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.452805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.453123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.454226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.454680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.456623 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.457111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.457174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.471960 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.478241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.480058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.589222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.642658 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:02 crc kubenswrapper[4775]: I0127 11:38:02.451249 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:03 crc kubenswrapper[4775]: E0127 11:38:03.339322 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" Jan 27 11:38:03 crc kubenswrapper[4775]: I0127 11:38:03.796248 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.074862 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:04 crc kubenswrapper[4775]: W0127 11:38:04.078195 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48cf8210_a82a_4a2c_9c3f_b4f28cc3e4b2.slice/crio-03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4 WatchSource:0}: Error finding container 03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4: Status 404 returned error can't find the container with id 03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.088985 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.101750 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.111121 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 
11:38:04.118510 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.131932 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.221394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"7725e0d31cab8fdd988ddc82ff5c6e00f8aac8edb67890b0869f5c2b5c515d21"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.223064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"0d2d8798bfd1e1511045000c7dea13845346445c7084205b6f8006dd91903bbd"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.224234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerStarted","Data":"eb38c8a0c3a6789c718c68d92bcb1866aef03b97c45dbc45b7c10e4d35714637"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.226721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerStarted","Data":"98a47029353e8ac81c34e8a77e13a6ae144436ae57c8cc4cc8ecca40c93dad8a"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228760 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" containerID="cri-o://089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228858 4775 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" containerID="cri-o://443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228896 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" containerID="cri-o://5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.229134 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.233087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"f81b70a5029c4a6796c226030165299ba18cc9b31e7b37fbe5cd06acf314b976"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.234273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"2c4d9c2cc89971c922a946b8d57e98b6524b909e64d2d0876060a57a1644a6f7"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.235572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4"} Jan 27 11:38:05 crc 
kubenswrapper[4775]: I0127 11:38:05.252041 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" containerID="443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2" exitCode=0 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252647 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" containerID="5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5" exitCode=2 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.254126 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.255092 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.256966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" 
event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.256995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.257055 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.257264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.264721 4775 generic.go:334] "Generic (PLEG): container finished" podID="53358000-8708-4b14-9f75-49ae61de192c" containerID="e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3" exitCode=0 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.264962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerDied","Data":"e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.266986 4775 generic.go:334] "Generic (PLEG): container finished" podID="91668934-529e-4df9-b41f-8cd54e5920ea" containerID="88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add" exitCode=0 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.267014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add"} Jan 27 11:38:05 crc kubenswrapper[4775]: 
I0127 11:38:05.302851 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7696d8466d-w52tt" podStartSLOduration=4.302832105 podStartE2EDuration="4.302832105s" podCreationTimestamp="2026-01-27 11:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:05.278968711 +0000 UTC m=+1064.420566498" watchObservedRunningTime="2026-01-27 11:38:05.302832105 +0000 UTC m=+1064.444429882" Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.347525 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6546ffcc78-4zdnk" podUID="00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.106795 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151811 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151972 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.152012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.177413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn" (OuterVolumeSpecName: "kube-api-access-cnrkn") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "kube-api-access-cnrkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.180377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.206375 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config" (OuterVolumeSpecName: "config") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.209979 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.227714 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.238637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257367 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257701 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257713 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257722 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") on node \"crc\" 
DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257731 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257739 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.302369 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" containerID="089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e" exitCode=0 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.302438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.312918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313047 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log" containerID="cri-o://d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0" gracePeriod=30 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313084 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api" 
containerID="cri-o://7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1" gracePeriod=30 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313190 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.335600 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.336400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerDied","Data":"eb38c8a0c3a6789c718c68d92bcb1866aef03b97c45dbc45b7c10e4d35714637"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.336471 4775 scope.go:117] "RemoveContainer" containerID="e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.358323 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.35828949 podStartE2EDuration="6.35828949s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:06.337672834 +0000 UTC m=+1065.479270611" watchObservedRunningTime="2026-01-27 11:38:06.35828949 +0000 UTC m=+1065.499887267" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.462870 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.504344 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:38:06 crc kubenswrapper[4775]: E0127 11:38:06.504789 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53358000-8708-4b14-9f75-49ae61de192c" 
containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.504820 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="53358000-8708-4b14-9f75-49ae61de192c" containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.505034 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="53358000-8708-4b14-9f75-49ae61de192c" containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.506020 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.506118 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.508724 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.508993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.528650 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581567 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581616 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: 
\"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581636 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581671 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581702 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581743 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod 
\"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: 
\"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.688059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.695951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.695959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 
27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.697131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.698409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.703892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.712374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.849349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345041 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerID="7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1" exitCode=0 Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345078 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerID="d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0" exitCode=143 Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1"} Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345124 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0"} Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.754893 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53358000-8708-4b14-9f75-49ae61de192c" path="/var/lib/kubelet/pods/53358000-8708-4b14-9f75-49ae61de192c/volumes" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.793230 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.828642 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905316 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905531 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906414 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906575 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 
crc kubenswrapper[4775]: I0127 11:38:07.906683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906741 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.907017 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.909151 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.909178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.912887 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55" (OuterVolumeSpecName: "kube-api-access-z5n55") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "kube-api-access-z5n55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.919067 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts" (OuterVolumeSpecName: "scripts") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.963314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015017 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015260 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015340 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015416 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.021677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.062684 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data" (OuterVolumeSpecName: "config-data") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.120250 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.120288 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.154441 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.154680 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" containerID="cri-o://8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d" gracePeriod=30 Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.155288 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" containerID="cri-o://8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe" gracePeriod=30 Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.185924 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186409 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186429 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" 
containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186482 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186493 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186512 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186521 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186770 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186805 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186818 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.188079 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.197846 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.262699 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": read tcp 10.217.0.2:45210->10.217.0.152:9696: read: connection reset by peer" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.323598 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.323974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324458 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.368792 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerStarted","Data":"a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615"}
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.368864 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"a1da85b3df4788f571e86de3391158e11cf2502b74702f3be38ea8d5b9dea0f2"}
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389832 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389851 4775 scope.go:117] "RemoveContainer" containerID="443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.410019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db"}
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.420830 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.423147 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"2c4d9c2cc89971c922a946b8d57e98b6524b909e64d2d0876060a57a1644a6f7"}
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428209 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.436111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437603 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437753 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" podStartSLOduration=7.437734375 podStartE2EDuration="7.437734375s" podCreationTimestamp="2026-01-27 11:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:08.389217824 +0000 UTC m=+1067.530815621" watchObservedRunningTime="2026-01-27 11:38:08.437734375 +0000 UTC m=+1067.579332152"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.443524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.444034 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.452953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.464241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.469637 4775 scope.go:117] "RemoveContainer" containerID="5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.506576 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.514844 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.522596 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.523098 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523118 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api"
Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.523136 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523142 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523369 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523408 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530561 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") "
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.532591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.535871 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs" (OuterVolumeSpecName: "logs") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.537661 4775 scope.go:117] "RemoveContainer" containerID="089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.543544 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.549075 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts" (OuterVolumeSpecName: "scripts") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.550357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z" (OuterVolumeSpecName: "kube-api-access-xpt4z") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "kube-api-access-xpt4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.563794 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.563975 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.567383 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"]
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.568929 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.569111 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.589733 4775 scope.go:117] "RemoveContainer" containerID="7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633764 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633807 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633837 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633853 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633901 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633918 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634012 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634023 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634031 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634039 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634047 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634845 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.645848 4775 scope.go:117] "RemoveContainer" containerID="d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.647762 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data" (OuterVolumeSpecName: "config-data") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.720704 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735371 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735620 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735701 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735714 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.737396 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.741851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.742156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.744072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.746690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.755739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0"
Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.897166 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.263530 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.439234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442457 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"7741a0906f599fd7687720fdb78021f6c23a07fdd0533bbdc83dc1e97a16a161"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.443791 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8bc6678d8-674l9"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.443818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8bc6678d8-674l9"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.449016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.449075 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.456764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.467524 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477322 4775 generic.go:334] "Generic (PLEG): container finished" podID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerID="2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c" exitCode=137
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477360 4775 generic.go:334] "Generic (PLEG): container finished" podID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerID="5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49" exitCode=137
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.478396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"53a128ffc6e310fa157dfd37a105cff396b2195c605357ef2976ef48f28caaf9"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.481875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8bc6678d8-674l9" podStartSLOduration=3.481859139 podStartE2EDuration="3.481859139s" podCreationTimestamp="2026-01-27 11:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:09.467930387 +0000 UTC m=+1068.609528164" watchObservedRunningTime="2026-01-27 11:38:09.481859139 +0000 UTC m=+1068.623456916"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.502012 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerID="8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe" exitCode=0
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.502107 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.506895 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556416 4775 generic.go:334] "Generic (PLEG): container finished" podID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerID="27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" exitCode=137
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556747 4775 generic.go:334] "Generic (PLEG): container finished" podID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerID="5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" exitCode=137
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556789 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde"}
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.573123 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-695f7dfd45-zbb58" podStartSLOduration=5.564138901 podStartE2EDuration="9.573103191s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:03.831957919 +0000 UTC m=+1062.973555696" lastFinishedPulling="2026-01-27 11:38:07.840922209 +0000 UTC m=+1066.982519986" observedRunningTime="2026-01-27 11:38:09.553612166 +0000 UTC m=+1068.695209943" watchObservedRunningTime="2026-01-27 11:38:09.573103191 +0000 UTC m=+1068.714700968"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.591596 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podStartSLOduration=5.8064087650000005 podStartE2EDuration="9.591572387s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:04.081677137 +0000 UTC m=+1063.223274914" lastFinishedPulling="2026-01-27 11:38:07.866840759 +0000 UTC m=+1067.008438536" observedRunningTime="2026-01-27 11:38:09.526431501 +0000 UTC m=+1068.668029278" watchObservedRunningTime="2026-01-27 11:38:09.591572387 +0000 UTC m=+1068.733170164"
Jan 27 11:38:09 crc kubenswrapper[4775]: W0127 11:38:09.593311 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf43a36d6_24df_43c5_9d20_aaa35c11f855.slice/crio-28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c WatchSource:0}: Error finding container 28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c: Status 404 returned error can't find the container with id 28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.626477 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.636880 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.655078 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.656810 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.662857 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.663683 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.663952 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.664229 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.757910 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a82d041-4b07-491a-8af6-232e67a23299" path="/var/lib/kubelet/pods/8a82d041-4b07-491a-8af6-232e67a23299/volumes"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.759140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" path="/var/lib/kubelet/pods/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be/volumes"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.759945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0"
Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760078 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0"
Jan 27 11:38:09 crc
kubenswrapper[4775]: I0127 11:38:09.760170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760186 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760293 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: 
\"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863978 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.865252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.866211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.868316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.870253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.870602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.871126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.875615 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.878823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.884391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.943960 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.030592 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077442 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077586 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.078295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs" (OuterVolumeSpecName: "logs") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.082919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.085438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4" (OuterVolumeSpecName: "kube-api-access-ngfh4") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "kube-api-access-ngfh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.126060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data" (OuterVolumeSpecName: "config-data") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.136728 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts" (OuterVolumeSpecName: "scripts") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179185 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179213 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179222 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179231 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179261 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.281186 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.383794 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.386766 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs" (OuterVolumeSpecName: "logs") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.395636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.395737 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7" (OuterVolumeSpecName: "kube-api-access-krmx7") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "kube-api-access-krmx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.465983 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data" (OuterVolumeSpecName: "config-data") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.470484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts" (OuterVolumeSpecName: "scripts") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485932 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485954 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485965 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485973 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485981 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.581634 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.592513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.592938 4775 scope.go:117] "RemoveContainer" 
containerID="27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.593175 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.630992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.631043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639889 4775 generic.go:334] "Generic (PLEG): container finished" podID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerID="69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99" exitCode=137 Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639932 4775 generic.go:334] "Generic (PLEG): container finished" podID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerID="d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2" exitCode=137 Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2"} Jan 
27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.647249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.654271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"e85e8e0f44ac4f6cbdc0a4bbf06db8528c1a4b4037fff448eea7f2f74eae3616"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.654565 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.660011 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.682191 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.684704 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.515606653 podStartE2EDuration="10.684692715s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:04.082072257 +0000 UTC m=+1063.223670034" lastFinishedPulling="2026-01-27 11:38:08.251158319 +0000 UTC m=+1067.392756096" observedRunningTime="2026-01-27 11:38:10.667947376 +0000 UTC m=+1069.809545153" watchObservedRunningTime="2026-01-27 11:38:10.684692715 +0000 UTC m=+1069.826290492" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.688418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" 
event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.688520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.690051 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.726184 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.738717 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.746789 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f57cbf767-xvk7k" podStartSLOduration=2.746769007 podStartE2EDuration="2.746769007s" podCreationTimestamp="2026-01-27 11:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:10.718531803 +0000 UTC m=+1069.860129600" watchObservedRunningTime="2026-01-27 11:38:10.746769007 +0000 UTC m=+1069.888366804" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.881628 4775 scope.go:117] "RemoveContainer" containerID="5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.911833 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.919693 4775 scope.go:117] "RemoveContainer" 
containerID="2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c"
Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.960022 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") "
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102146 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") "
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102189 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") "
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102250 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") "
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") "
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.103337 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs" (OuterVolumeSpecName: "logs") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.108510 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs" (OuterVolumeSpecName: "kube-api-access-jqxbs") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "kube-api-access-jqxbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.110562 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.135697 4775 scope.go:117] "RemoveContainer" containerID="5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.142822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data" (OuterVolumeSpecName: "config-data") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.149280 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts" (OuterVolumeSpecName: "scripts") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204315 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204349 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204358 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204367 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204376 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.703617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0"
event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361"}
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.708062 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"}
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.708137 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"33dee4dc93223d68ed0c9843e6651623dd7c73f98dd4eee5700b9bc73cb6734c"}
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716739 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"96c45e8e9930bf07afed2f11987b0afd9b083256c7c2af2e8c36913249d87fa8"}
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716778 4775 scope.go:117] "RemoveContainer" containerID="69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716795 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.775832 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" path="/var/lib/kubelet/pods/29a2a294-6d96-4169-9be8-7109251bf8b1/volumes"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.776482 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" path="/var/lib/kubelet/pods/c73cda8b-d244-4ad1-8f54-f5680565327d/volumes"
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.887635 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"]
Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.895592 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"]
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.085773 4775 scope.go:117] "RemoveContainer" containerID="d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2"
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.740469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"}
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.741129 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.744723 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerID="8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d" exitCode=0
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.744798 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d"}
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.747237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a"}
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.770954 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.770934167 podStartE2EDuration="3.770934167s" podCreationTimestamp="2026-01-27 11:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:12.764535131 +0000 UTC m=+1071.906132908" watchObservedRunningTime="2026-01-27 11:38:12.770934167 +0000 UTC m=+1071.912531934"
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.824864 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b847b569-ccplz"
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962534 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962610 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962723 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") "
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.968305 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.974141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv" (OuterVolumeSpecName: "kube-api-access-lvbbv") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "kube-api-access-lvbbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.016974 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.019170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config" (OuterVolumeSpecName: "config") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.028029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.035872 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.043183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064937 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064974 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064984 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064997 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065006 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065015 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065023 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.208319 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.365006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.758948 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" path="/var/lib/kubelet/pods/dd14daeb-9a49-4720-9c96-b6caf1257d5a/volumes"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.761560 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b847b569-ccplz"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.762299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"129e86fff0154f3e4de3082e715fe1284c270556711420ae01c9066fffafb3c8"}
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.762358 4775 scope.go:117] "RemoveContainer" containerID="8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.799986 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b847b569-ccplz"]
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.810494 4775 scope.go:117] "RemoveContainer" containerID="8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d"
Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.817263 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55b847b569-ccplz"]
Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.771643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30"}
Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.772948 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.822085 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.921356365 podStartE2EDuration="6.822058536s" podCreationTimestamp="2026-01-27 11:38:08 +0000 UTC" firstStartedPulling="2026-01-27 11:38:09.635157143 +0000 UTC m=+1068.776754920" lastFinishedPulling="2026-01-27 11:38:13.535859314 +0000 UTC m=+1072.677457091" observedRunningTime="2026-01-27 11:38:14.806418527 +0000 UTC m=+1073.948016324" watchObservedRunningTime="2026-01-27 11:38:14.822058536 +0000 UTC m=+1073.963656313"
Jan 27 11:38:15 crc kubenswrapper[4775]: I0127 11:38:15.758158 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" path="/var/lib/kubelet/pods/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c/volumes"
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.176497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.223199 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.590625 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.638389 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"]
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.638642 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" containerID="cri-o://a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a" gracePeriod=10
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.805437 4775 generic.go:334] "Generic (PLEG): container finished" podID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerID="a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a" exitCode=0
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.805603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a"}
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.806294 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" containerID="cri-o://93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" gracePeriod=30
Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.806705 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" containerID="cri-o://91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" gracePeriod=30
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.196039 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352510 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") "
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.387623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5" (OuterVolumeSpecName: "kube-api-access-vwjs5") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "kube-api-access-vwjs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.411182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.430906 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.435840 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460823 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460856 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460865 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460874 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.462933 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.472113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config" (OuterVolumeSpecName: "config") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.568669 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.568717 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.803796 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84666cddfd-6l8vq"
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.806120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6546ffcc78-4zdnk"
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.816619 4775 generic.go:334] "Generic (PLEG): container finished" podID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerID="91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" exitCode=0
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.816726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131"}
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"f613d08fcd685ed44899c259a171ad733b3147458ae9f365bbc1e423524fcf00"}
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819789 4775 scope.go:117] "RemoveContainer" containerID="a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a"
Jan 27 
11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819963 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.848251 4775 scope.go:117] "RemoveContainer" containerID="93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168"
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.881032 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"]
Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.896800 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"]
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.416960 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8bc6678d8-674l9"
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.459584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8bc6678d8-674l9"
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.521411 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"]
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.530828 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" containerID="cri-o://e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" gracePeriod=30
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.531410 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" containerID="cri-o://f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" gracePeriod=30
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.828469 4775 generic.go:334] "Generic (PLEG): container finished" podID="837199be-1d46-4982-93ee-3f28a585d1d0" containerID="e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" exitCode=143
Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.828661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0"}
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.634033 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-84666cddfd-6l8vq"
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.758417 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" path="/var/lib/kubelet/pods/558b9501-01cb-43ac-aed0-f0cbc868ce59/volumes"
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.890134 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6546ffcc78-4zdnk"
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.966845 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"]
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.967122 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" containerID="cri-o://b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" gracePeriod=30
Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.967236 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" containerID="cri-o://0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" gracePeriod=30
Jan 27 11:38:20 crc kubenswrapper[4775]: I0127 11:38:20.866609 4775 generic.go:334] "Generic (PLEG): container finished" podID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerID="93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" exitCode=0
Jan 27 11:38:20 crc kubenswrapper[4775]: I0127 11:38:20.866658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8"}
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.203591 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.346633 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.346994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347098 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347213 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") "
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.348264 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.353465 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts" (OuterVolumeSpecName: "scripts") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.353473 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5" (OuterVolumeSpecName: "kube-api-access-xvtx5") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "kube-api-access-xvtx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.354170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.411531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449249 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449283 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449293 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449304 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449313 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.468236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data" (OuterVolumeSpecName: "config-data") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.550816 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.700301 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:48020->10.217.0.163:9311: read: connection reset by peer" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.702414 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:48028->10.217.0.163:9311: read: connection reset by peer" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.869197 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.917954 4775 generic.go:334] "Generic (PLEG): container finished" podID="837199be-1d46-4982-93ee-3f28a585d1d0" containerID="f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" exitCode=0 Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.918150 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2"} Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.922962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4"} Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.922998 4775 scope.go:117] "RemoveContainer" containerID="91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.923110 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.963598 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.970595 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.995569 4775 scope.go:117] "RemoveContainer" containerID="93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.995675 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996011 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996025 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996049 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996064 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996085 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996092 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996105 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996112 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996129 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996136 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996146 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996153 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996165 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996173 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996183 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996192 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996210 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996219 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996228 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996234 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996248 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="init" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996257 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="init" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996484 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" 
Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996495 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996512 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996520 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996531 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996546 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996554 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996562 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996573 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996583 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996591 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc 
kubenswrapper[4775]: I0127 11:38:21.997736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.001919 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.031215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163602 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163662 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: 
\"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163713 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.188357 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.265473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.265985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266167 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266513 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.267349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.272883 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " 
pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.274028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.274058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.275533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.285588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.317549 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367409 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367521 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367601 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367754 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.369283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs" (OuterVolumeSpecName: "logs") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.374274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t" (OuterVolumeSpecName: "kube-api-access-xjd8t") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "kube-api-access-xjd8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.376500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.397421 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.438888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data" (OuterVolumeSpecName: "config-data") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470345 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470378 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470406 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470416 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470426 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.780006 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:22 crc kubenswrapper[4775]: W0127 11:38:22.784505 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7287c167_2d78_4766_b072_0762f4c4d504.slice/crio-bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398 WatchSource:0}: Error finding container bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398: 
Status 404 returned error can't find the container with id bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398 Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.934678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398"} Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936129 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"0d2d8798bfd1e1511045000c7dea13845346445c7084205b6f8006dd91903bbd"} Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936190 4775 scope.go:117] "RemoveContainer" containerID="f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.972186 4775 scope.go:117] "RemoveContainer" containerID="e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.974168 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.981004 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.226284 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.380661 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:38:23 crc 
kubenswrapper[4775]: I0127 11:38:23.758174 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" path="/var/lib/kubelet/pods/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2/volumes" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.759397 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" path="/var/lib/kubelet/pods/837199be-1d46-4982-93ee-3f28a585d1d0/volumes" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792214 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:23 crc kubenswrapper[4775]: E0127 11:38:23.792645 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792658 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: E0127 11:38:23.792678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792888 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.793575 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.796349 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.796650 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-j8z7v" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.798065 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.805015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901978 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.953498 4775 generic.go:334] "Generic (PLEG): container finished" podID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerID="0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" exitCode=0 Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.953558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d"} Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.959856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003308 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003356 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.004347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.007823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.007936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.021711 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdqk\" (UniqueName: 
\"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.049967 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.050683 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.060701 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.146385 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.147560 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.156679 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.191159 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:38:24 crc kubenswrapper[4775]: E0127 11:38:24.196871 4775 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 27 11:38:24 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_66dbc089-aa1e-46ef-a8a8-c3fdb1f590af_0(539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634" 
Netns:"/var/run/netns/ae8e53bd-aec2-44a5-9f5b-93c2c01aba92" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634;K8S_POD_UID=66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af]: expected pod UID "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" but got "db40a4a8-ce91-40a6-8b63-ccc17ed327da" from Kube API Jan 27 11:38:24 crc kubenswrapper[4775]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 27 11:38:24 crc kubenswrapper[4775]: > Jan 27 11:38:24 crc kubenswrapper[4775]: E0127 11:38:24.196938 4775 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 27 11:38:24 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_66dbc089-aa1e-46ef-a8a8-c3fdb1f590af_0(539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634" Netns:"/var/run/netns/ae8e53bd-aec2-44a5-9f5b-93c2c01aba92" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634;K8S_POD_UID=66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" Path:"" 
ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af]: expected pod UID "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" but got "db40a4a8-ce91-40a6-8b63-ccc17ed327da" from Kube API Jan 27 11:38:24 crc kubenswrapper[4775]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 27 11:38:24 crc kubenswrapper[4775]: > pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308889 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308917 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.415930 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.418053 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.427612 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.472276 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.957542 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.974529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"db40a4a8-ce91-40a6-8b63-ccc17ed327da","Type":"ContainerStarted","Data":"4866618d875b6b078a88700414f5169eaaf32dbde4a7d1b35c3fd383e5744baf"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.976932 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.977545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.980123 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" podUID="db40a4a8-ce91-40a6-8b63-ccc17ed327da" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.999269 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.99925412 podStartE2EDuration="3.99925412s" podCreationTimestamp="2026-01-27 11:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:24.998300804 +0000 UTC m=+1084.139898581" watchObservedRunningTime="2026-01-27 11:38:24.99925412 +0000 UTC m=+1084.140851897" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.012368 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.131920 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.131965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.132096 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.132150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.134224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.137114 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.144100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk" (OuterVolumeSpecName: "kube-api-access-bqdqk") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "kube-api-access-bqdqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.144109 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233817 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233846 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233858 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233866 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.254550 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.754988 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" path="/var/lib/kubelet/pods/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af/volumes" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.984209 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.990521 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" podUID="db40a4a8-ce91-40a6-8b63-ccc17ed327da" Jan 27 11:38:27 crc kubenswrapper[4775]: I0127 11:38:27.318364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127247 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127556 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent" containerID="cri-o://c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127695 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" containerID="cri-o://757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127772 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core" containerID="cri-o://004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127823 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent" 
containerID="cri-o://9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.140317 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.487735 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.489476 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.494379 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.496605 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.497037 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.512050 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598359 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598426 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598496 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598992 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.599016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700591 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700635 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700687 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.701703 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.702840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.707321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.708191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.708942 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.709615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" 
(UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.715984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.719101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.816516 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058917 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30" exitCode=0 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058949 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a" exitCode=2 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058957 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66" exitCode=0 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.059002 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.059012 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.062498 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.067242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.099375 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.175786 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.177058 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.197750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.207745 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.209022 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.214251 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.217705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.217882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.218195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319697 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319725 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319848 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.320590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.338064 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.339154 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.352368 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.354122 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.358255 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.361039 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.368148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.380639 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.390063 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.420924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbhts\" (UniqueName: 
\"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421236 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.422017 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.424295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.444542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.446709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcs7g\" (UniqueName: 
\"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.493642 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.517035 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.522656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523703 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.525589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.537189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.555232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.564574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.566227 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.568943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.581067 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.627881 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.628157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.629360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.653237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod 
\"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.704991 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.730639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.730769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.794047 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.833290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.833417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.843789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.872042 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.912473 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 
11:38:29.920530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: W0127 11:38:29.935084 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027dfac2_8504_46aa_9302_19df71441688.slice/crio-a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1 WatchSource:0}: Error finding container a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1: Status 404 returned error can't find the container with id a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1 Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.010830 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:30 crc kubenswrapper[4775]: W0127 11:38:30.032909 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47764d9e_0435_43b7_aa95_e0a7e0d8b9c1.slice/crio-86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36 WatchSource:0}: Error finding container 86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36: Status 404 returned error can't find the container with id 86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36 Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.070701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerStarted","Data":"86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.073784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" 
event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerStarted","Data":"a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.079957 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.079998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"fa5db8a5c7621b855f9aee7c911007cac93d44ed2023e821a1db694da3d675fa"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.178180 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:30 crc kubenswrapper[4775]: W0127 11:38:30.202563 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb03d69b1_c651_4b79_9ba1_581dc15737a6.slice/crio-8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc WatchSource:0}: Error finding container 8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc: Status 404 returned error can't find the container with id 8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.334399 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.425074 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.526764 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"]
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.092987 4775 generic.go:334] "Generic (PLEG): container finished" podID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerID="4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f" exitCode=0
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.093250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerDied","Data":"4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.093277 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerStarted","Data":"8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.100190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerStarted","Data":"d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102335 4775 generic.go:334] "Generic (PLEG): container finished" podID="d6287027-2778-4115-b173-62b1600d0247" containerID="7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5" exitCode=0
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerDied","Data":"7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerStarted","Data":"d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.104945 4775 generic.go:334] "Generic (PLEG): container finished" podID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerID="f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf" exitCode=0
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.105022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerDied","Data":"f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.108763 4775 generic.go:334] "Generic (PLEG): container finished" podID="027dfac2-8504-46aa-9302-19df71441688" containerID="a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3" exitCode=0
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.108841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerDied","Data":"a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.134727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"}
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.134993 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-55bc6945f7-5kkp2"
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.135116 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-55bc6945f7-5kkp2"
Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.187629 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podStartSLOduration=3.187611936 podStartE2EDuration="3.187611936s" podCreationTimestamp="2026-01-27 11:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:31.176952405 +0000 UTC m=+1090.318550182" watchObservedRunningTime="2026-01-27 11:38:31.187611936 +0000 UTC m=+1090.329209703"
Jan 27 11:38:32 crc kubenswrapper[4775]: I0127 11:38:32.573768 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 27 11:38:33 crc kubenswrapper[4775]: I0127 11:38:33.154382 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361" exitCode=0
Jan 27 11:38:33 crc kubenswrapper[4775]: I0127 11:38:33.154459 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361"}
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.118248 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.119753 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" containerID="cri-o://bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" gracePeriod=30
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.119849 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" containerID="cri-o://42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" gracePeriod=30
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175407 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175654 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" containerID="cri-o://dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" gracePeriod=30
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175717 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" containerID="cri-o://89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" gracePeriod=30
Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.255130 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.181131 4775 generic.go:334] "Generic (PLEG): container finished" podID="29838f60-9966-4962-9842-b6010abc1468" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" exitCode=143
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.181194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"}
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.183646 4775 generic.go:334] "Generic (PLEG): container finished" podID="7287c167-2d78-4766-b072-0762f4c4d504" containerID="42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" exitCode=0
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.183682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7"}
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.757952 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.845676 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.868052 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.882084 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.893377 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.898236 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"d6287027-2778-4115-b173-62b1600d0247\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") "
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.898338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"d6287027-2778-4115-b173-62b1600d0247\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") "
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.899401 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6287027-2778-4115-b173-62b1600d0247" (UID: "d6287027-2778-4115-b173-62b1600d0247"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.905923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts" (OuterVolumeSpecName: "kube-api-access-fbhts") pod "d6287027-2778-4115-b173-62b1600d0247" (UID: "d6287027-2778-4115-b173-62b1600d0247"). InnerVolumeSpecName "kube-api-access-fbhts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"027dfac2-8504-46aa-9302-19df71441688\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"b03d69b1-c651-4b79-9ba1-581dc15737a6\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000648 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000807 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"b03d69b1-c651-4b79-9ba1-581dc15737a6\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001120 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"027dfac2-8504-46aa-9302-19df71441688\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001145 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "027dfac2-8504-46aa-9302-19df71441688" (UID: "027dfac2-8504-46aa-9302-19df71441688"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001163 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") "
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001907 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002224 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002297 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002357 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002416 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002496 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.003926 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj" (OuterVolumeSpecName: "kube-api-access-t5szj") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "kube-api-access-t5szj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.004822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9" (OuterVolumeSpecName: "kube-api-access-6rbr9") pod "027dfac2-8504-46aa-9302-19df71441688" (UID: "027dfac2-8504-46aa-9302-19df71441688"). InnerVolumeSpecName "kube-api-access-6rbr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.005229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b03d69b1-c651-4b79-9ba1-581dc15737a6" (UID: "b03d69b1-c651-4b79-9ba1-581dc15737a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.005587 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" (UID: "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.006839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts" (OuterVolumeSpecName: "scripts") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.007182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g" (OuterVolumeSpecName: "kube-api-access-kcs7g") pod "b03d69b1-c651-4b79-9ba1-581dc15737a6" (UID: "b03d69b1-c651-4b79-9ba1-581dc15737a6"). InnerVolumeSpecName "kube-api-access-kcs7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.007219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln" (OuterVolumeSpecName: "kube-api-access-c27ln") pod "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" (UID: "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1"). InnerVolumeSpecName "kube-api-access-c27ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.037245 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.076565 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.097915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data" (OuterVolumeSpecName: "config-data") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103732 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103765 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103778 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103790 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103804 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103815 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103826 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103836 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103846 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103859 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") on node \"crc\" DevicePath \"\""
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.194922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerDied","Data":"86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.194968 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.195031 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerDied","Data":"a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200250 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200223 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202001 4775 generic.go:334] "Generic (PLEG): container finished" podID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerID="23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95" exitCode=0
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerDied","Data":"23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerStarted","Data":"3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204015 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerDied","Data":"8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204068 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.209916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.209976 4775 scope.go:117] "RemoveContainer" containerID="757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.210099 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.221489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"db40a4a8-ce91-40a6-8b63-ccc17ed327da","Type":"ContainerStarted","Data":"2fdf573c5dbc3537a484b71651345a941aea5fdf62703cfc9098c6d6cbdcf0dc"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.226944 4775 generic.go:334] "Generic (PLEG): container finished" podID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerID="3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b" exitCode=0
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.227022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerDied","Data":"3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerDied","Data":"d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85"}
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233069 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233117 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.262577 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.691530685 podStartE2EDuration="13.261252177s" podCreationTimestamp="2026-01-27 11:38:24 +0000 UTC" firstStartedPulling="2026-01-27 11:38:24.962343238 +0000 UTC m=+1084.103941015" lastFinishedPulling="2026-01-27 11:38:36.53206473 +0000 UTC m=+1095.673662507" observedRunningTime="2026-01-27 11:38:37.259230462 +0000 UTC m=+1096.400828249" watchObservedRunningTime="2026-01-27 11:38:37.261252177 +0000 UTC m=+1096.402849954"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.293311 4775 scope.go:117] "RemoveContainer" containerID="004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.343869 4775 scope.go:117] "RemoveContainer" containerID="9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.369495 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.375415 4775 scope.go:117] "RemoveContainer" containerID="c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.397553 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412042 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412439 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412473 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412484 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412491 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412501 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412506 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412571 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412579 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412630 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412636 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412647 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412653 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412666 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412672 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412680 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412915 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412932 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412944 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412956 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412970 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412989 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412997 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.415662 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.417968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.418743 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.434506 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.511977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0"
Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613353 4775
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613597 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: 
\"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.615024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.615894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.618233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.619208 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.627860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.630481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.633260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.754954 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" path="/var/lib/kubelet/pods/f43a36d6-24df-43c5-9d20-aaa35c11f855/volumes" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.777008 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.125792 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.295234 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:38 crc kubenswrapper[4775]: W0127 11:38:38.356870 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod093516f4_3b85_4290_98c0_006f41e91129.slice/crio-6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d WatchSource:0}: Error finding container 6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d: Status 404 returned error can't find the container with id 6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.737753 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.746714 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.752271 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.757664 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.827881 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.828520 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66f4cff584-s28fg" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" containerID="cri-o://4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" gracePeriod=30 Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.828678 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66f4cff584-s28fg" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" containerID="cri-o://a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" gracePeriod=30 Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"a18608c5-afda-4481-9c6d-a576dfd4d803\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: 
\"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832306 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832464 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832495 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"a18608c5-afda-4481-9c6d-a576dfd4d803\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.834653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.835357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aeed29be-d561-4bf4-bdc1-c180e1983a3c" (UID: "aeed29be-d561-4bf4-bdc1-c180e1983a3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.839923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.843551 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.844556 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs" (OuterVolumeSpecName: "logs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.844798 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a18608c5-afda-4481-9c6d-a576dfd4d803" (UID: "a18608c5-afda-4481-9c6d-a576dfd4d803"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.865488 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.867822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s" (OuterVolumeSpecName: "kube-api-access-rg99s") pod "a18608c5-afda-4481-9c6d-a576dfd4d803" (UID: "a18608c5-afda-4481-9c6d-a576dfd4d803"). InnerVolumeSpecName "kube-api-access-rg99s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.870603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts" (OuterVolumeSpecName: "scripts") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.874071 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s" (OuterVolumeSpecName: "kube-api-access-9wz5s") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "kube-api-access-9wz5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.875640 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c" (OuterVolumeSpecName: "kube-api-access-7sb9c") pod "aeed29be-d561-4bf4-bdc1-c180e1983a3c" (UID: "aeed29be-d561-4bf4-bdc1-c180e1983a3c"). InnerVolumeSpecName "kube-api-access-7sb9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947958 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947987 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947997 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948011 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948021 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948030 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948043 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 
11:38:38.948188 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948198 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.979896 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.009154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.010935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data" (OuterVolumeSpecName: "config-data") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.045756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051032 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051058 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051068 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051076 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250404 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerDied","Data":"d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250683 4775 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250428 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.255758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.255790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.258084 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerID="a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.258136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.260727 4775 generic.go:334] "Generic (PLEG): container finished" podID="7287c167-2d78-4766-b072-0762f4c4d504" containerID="bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.260773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262747 4775 generic.go:334] "Generic (PLEG): container finished" podID="29838f60-9966-4962-9842-b6010abc1468" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262817 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"33dee4dc93223d68ed0c9843e6651623dd7c73f98dd4eee5700b9bc73cb6734c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262852 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262863 4775 scope.go:117] "RemoveContainer" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.279998 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.282318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerDied","Data":"3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.282374 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.296580 4775 scope.go:117] "RemoveContainer" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.311541 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.324940 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.340389 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341281 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341344 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341401 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341487 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341587 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341747 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342014 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342285 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342355 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342437 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.343620 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.347020 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.351179 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.352509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.354149 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.354550 4775 scope.go:117] "RemoveContainer" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.355035 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": container with ID starting with 89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290 not found: ID does not exist" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.355062 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"} err="failed to get container status \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": rpc error: code = NotFound desc = could not find container \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": container with ID starting with 89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290 not found: ID does not exist" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 
11:38:39.355082 4775 scope.go:117] "RemoveContainer" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.356402 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": container with ID starting with dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374 not found: ID does not exist" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.356418 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"} err="failed to get container status \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": rpc error: code = NotFound desc = could not find container \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": container with ID starting with dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374 not found: ID does not exist" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 
11:38:39.457923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457988 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458050 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458140 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trvc\" (UniqueName: 
\"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560548 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc 
kubenswrapper[4775]: I0127 11:38:39.560577 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trvc\" (UniqueName: \"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560654 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.573360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod 
\"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.574108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.578421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.583091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.583911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.584288 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.586960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.596535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.598394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trvc\" (UniqueName: \"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.672775 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.758398 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29838f60-9966-4962-9842-b6010abc1468" path="/var/lib/kubelet/pods/29838f60-9966-4962-9842-b6010abc1468/volumes" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.820193 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.969376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.969900 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970299 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970421 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.971403 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982344 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g" (OuterVolumeSpecName: "kube-api-access-ctt6g") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "kube-api-access-ctt6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982346 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts" (OuterVolumeSpecName: "scripts") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982706 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.052199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072210 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072244 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072253 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072263 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072272 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.104960 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data" (OuterVolumeSpecName: "config-data") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.163211 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.173946 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301497 4775 scope.go:117] "RemoveContainer" containerID="42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301458 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.314850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"6eb38ee4612f38acec5d852feb5e8b04f871568e33480d91f8e13c85951dfadd"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.323240 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.334595 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.345298 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.360105 4775 scope.go:117] "RemoveContainer" containerID="bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.370689 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: E0127 11:38:40.371107 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371118 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: E0127 11:38:40.371141 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371147 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371311 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371323 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.372223 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.380614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.381076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.479045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod 
\"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.590200 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.591379 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.598799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.600609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.608094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.749506 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.306714 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:41 crc kubenswrapper[4775]: W0127 11:38:41.318492 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030ef7f1_5f79_42e9_800e_55c4f70964e5.slice/crio-9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d WatchSource:0}: Error finding container 9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d: Status 404 returned error can't find the container with id 9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.363988 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.367144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.369881 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"58702b3569f40cd06e51a8afa2d27fff5a3e1b8bcd993716870b1819390a0075"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.761746 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7287c167-2d78-4766-b072-0762f4c4d504" path="/var/lib/kubelet/pods/7287c167-2d78-4766-b072-0762f4c4d504/volumes" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.397873 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"f1971861217a880e224af0a1c5a5cefad7b0c93edf4ecdc4d5d6b7ea42934acd"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.411597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"538331ab0712b0add08c5b63361b93aed077ba69f583f33736f2ce481499e323"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.411718 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.431883 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.431867083 podStartE2EDuration="3.431867083s" podCreationTimestamp="2026-01-27 11:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:42.425606902 +0000 UTC m=+1101.567204689" watchObservedRunningTime="2026-01-27 11:38:42.431867083 +0000 UTC m=+1101.573464860" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437853 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" containerID="cri-o://9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437960 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438263 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" containerID="cri-o://7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438304 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" containerID="cri-o://43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438340 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" containerID="cri-o://24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.457077 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.949702099 podStartE2EDuration="5.457063744s" podCreationTimestamp="2026-01-27 11:38:37 +0000 UTC" firstStartedPulling="2026-01-27 11:38:38.369722785 +0000 UTC m=+1097.511320562" lastFinishedPulling="2026-01-27 11:38:41.87708444 +0000 UTC m=+1101.018682207" observedRunningTime="2026-01-27 11:38:42.456227061 +0000 UTC m=+1101.597824838" watchObservedRunningTime="2026-01-27 11:38:42.457063744 +0000 UTC m=+1101.598661521" Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.449609 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"f8c39efe4d2dcc34b627df41a48948ef3ee1ac1a734fb7f5acd3072575dd0fc5"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458063 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" exitCode=0 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458107 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" exitCode=2 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458121 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" exitCode=0 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459186 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459227 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.477242 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.477224951 podStartE2EDuration="3.477224951s" podCreationTimestamp="2026-01-27 11:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:43.475817402 +0000 UTC m=+1102.617415189" watchObservedRunningTime="2026-01-27 11:38:43.477224951 +0000 UTC m=+1102.618822728" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.472224 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerID="4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" exitCode=0 Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.472287 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d"} Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.853287 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.854616 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.856250 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.856578 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.864042 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.866706 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.871193 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: 
\"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976608 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976855 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976899 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc 
kubenswrapper[4775]: I0127 11:38:44.976980 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.996003 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k" (OuterVolumeSpecName: "kube-api-access-xxk8k") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "kube-api-access-xxk8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.010004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.043915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config" (OuterVolumeSpecName: "config") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.046674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.076402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078311 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod 
\"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078584 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078603 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078614 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078626 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078635 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.081594 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.081941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.082964 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.094784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.181521 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.253558 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.253732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.481687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"98a20e3bbe057f1a1083416d0cff14282fdc9e2fca7261f4540fdf9a82145994"} Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.481766 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.482031 4775 scope.go:117] "RemoveContainer" containerID="a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.531110 4775 scope.go:117] "RemoveContainer" containerID="4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.533830 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.545160 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:45 crc kubenswrapper[4775]: W0127 11:38:45.629310 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5e7b0a_a4d0_4c64_b273_2b47230efd17.slice/crio-d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0 WatchSource:0}: Error finding container d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0: Status 404 returned error can't find the container with id d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0 Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.631781 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.757529 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" path="/var/lib/kubelet/pods/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed/volumes" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.758763 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:46 crc kubenswrapper[4775]: I0127 11:38:46.501263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerStarted","Data":"d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.074793 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.075167 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075182 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.075198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075205 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075368 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075385 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.076255 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.096231 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.101808 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231744 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231814 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231985 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9z42\" (UniqueName: 
\"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232032 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232304 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod 
\"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232466 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232526 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232574 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.233425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.234349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.238600 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts" (OuterVolumeSpecName: "scripts") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.238783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42" (OuterVolumeSpecName: "kube-api-access-s9z42") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "kube-api-access-s9z42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.262851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.332965 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334042 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: 
\"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334518 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " 
pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334637 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334649 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334658 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334668 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334676 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.335829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") 
pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: W0127 11:38:47.337135 4775 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/093516f4-3b85-4290-98c0-006f41e91129/volumes/kubernetes.io~secret/combined-ca-bundle Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.337181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.338863 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.339853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.339892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc 
kubenswrapper[4775]: I0127 11:38:47.341441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.343466 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.348985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.354480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.371612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data" (OuterVolumeSpecName: "config-data") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.406682 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.438873 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.438914 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520189 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" exitCode=0 Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520269 4775 scope.go:117] "RemoveContainer" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520269 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.552935 4775 scope.go:117] "RemoveContainer" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.562363 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.577819 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586015 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586392 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586407 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586435 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586441 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586470 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586477 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586487 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586493 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586643 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586655 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586672 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586681 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.588273 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.592300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.592643 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.593881 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.604550 4775 scope.go:117] "RemoveContainer" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.638105 4775 scope.go:117] "RemoveContainer" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683088 4775 scope.go:117] "RemoveContainer" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.683707 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": container with ID starting with 7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32 not found: ID does not exist" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683763 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} err="failed to get container status \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": rpc error: code = NotFound desc = could not find container \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": 
container with ID starting with 7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683852 4775 scope.go:117] "RemoveContainer" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.684598 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": container with ID starting with 43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88 not found: ID does not exist" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684628 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} err="failed to get container status \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": rpc error: code = NotFound desc = could not find container \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": container with ID starting with 43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684646 4775 scope.go:117] "RemoveContainer" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.684891 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": container with ID starting with 24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c not found: ID does not exist" 
containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684975 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} err="failed to get container status \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": rpc error: code = NotFound desc = could not find container \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": container with ID starting with 24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.685001 4775 scope.go:117] "RemoveContainer" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.685586 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": container with ID starting with 9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46 not found: ID does not exist" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.685630 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} err="failed to get container status \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": rpc error: code = NotFound desc = could not find container \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": container with ID starting with 9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748602 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748695 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748741 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.767291 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093516f4-3b85-4290-98c0-006f41e91129" path="/var/lib/kubelet/pods/093516f4-3b85-4290-98c0-006f41e91129/volumes" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.849945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850007 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850867 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: 
I0127 11:38:47.857957 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.858427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.859834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.860718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.873063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.915010 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.948520 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.378243 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"cc40f7435155744fb8655a00a8bcfac37f639ca67526c1a7fb8e190dfcda6662"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533565 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"f44610c4492d47323f8fc309fa651e0d922d1202d7775de5462e1c61c3e2c2b0"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533576 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"dabb6eafd56ad25cc1d4be4c65aba691db7d53364e22e215df7a739017b279d4"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533721 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.534880 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"8683e19a7bcd30570af286ce01224a28b785c454609defaa562ddd8aa8e80071"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.558520 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-66648b46df-hskmp" podStartSLOduration=1.558501506 
podStartE2EDuration="1.558501506s" podCreationTimestamp="2026-01-27 11:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:48.55213073 +0000 UTC m=+1107.693728517" watchObservedRunningTime="2026-01-27 11:38:48.558501506 +0000 UTC m=+1107.700099283" Jan 27 11:38:49 crc kubenswrapper[4775]: I0127 11:38:49.547408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} Jan 27 11:38:49 crc kubenswrapper[4775]: I0127 11:38:49.547773 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.558282 4775 generic.go:334] "Generic (PLEG): container finished" podID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerID="b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" exitCode=137 Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.559331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff"} Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.957612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 11:38:51 crc kubenswrapper[4775]: I0127 11:38:51.635862 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.149359 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.151440 4775 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.188070 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.189659 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.197676 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.207337 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.269965 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270066 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod 
\"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270254 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270296 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270322 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270340 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.296567 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.304278 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.354677 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371548 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371634 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371719 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371760 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 
11:38:54.371856 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371883 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371909 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.372382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.375461 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 
11:38:54.377766 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.378676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.379221 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.383230 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.391628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 
11:38:54.401399 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.405554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.412045 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.470298 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474638 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " 
pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475181 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.479847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.479911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: 
I0127 11:38:54.481401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.484051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.488359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.501026 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.516015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.625289 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.667362 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.667728 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" containerID="cri-o://164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" gracePeriod=30 Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.668236 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" containerID="cri-o://1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" gracePeriod=30 Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.748719 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.253629 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.528609 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.544191 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.578547 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.580417 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.594520 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.596315 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.609920 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.622198 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.654399 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.664296 4775 generic.go:334] "Generic (PLEG): container finished" podID="b138b14c-964d-465d-a534-c7aff1633e76" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" exitCode=143 Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.664339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"} Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.678228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.679817 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.688734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698487 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: 
\"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800565 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800586 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800628 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800735 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800784 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800805 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800888 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800914 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800936 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.801437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.802362 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.806608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.807063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.808854 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.809552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.818097 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.819410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.819899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.821811 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.901956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902049 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902083 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902154 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902219 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902268 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.903713 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.905903 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.906679 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.907193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.907569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.909657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.918074 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.922059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.933196 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.002700 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.117994 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.118288 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log" containerID="cri-o://8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" gracePeriod=30 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.118349 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd" containerID="cri-o://2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" gracePeriod=30 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.124809 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.206873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.206990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207024 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207093 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs" (OuterVolumeSpecName: "logs") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207904 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.213961 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr" (OuterVolumeSpecName: "kube-api-access-xs2dr") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "kube-api-access-xs2dr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.214279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.235694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data" (OuterVolumeSpecName: "config-data") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.236052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.250806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts" (OuterVolumeSpecName: "scripts") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.275966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310256 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310622 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310637 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310650 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310664 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310675 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.691645 4775 generic.go:334] "Generic (PLEG): container finished" podID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerID="8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" exitCode=143 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.692316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.699205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.700641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerStarted","Data":"cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"67af1fcb0bcad60b4d6220dc2a58636c77413c902e1d5d58f9a296545b8c138a"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703297 4775 scope.go:117] "RemoveContainer" containerID="0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703502 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703881 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.732705 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" podStartSLOduration=2.049613738 podStartE2EDuration="12.732691172s" podCreationTimestamp="2026-01-27 11:38:44 +0000 UTC" firstStartedPulling="2026-01-27 11:38:45.63904984 +0000 UTC m=+1104.780647617" lastFinishedPulling="2026-01-27 11:38:56.322127274 +0000 UTC m=+1115.463725051" observedRunningTime="2026-01-27 11:38:56.71203122 +0000 UTC m=+1115.853628997" watchObservedRunningTime="2026-01-27 11:38:56.732691172 +0000 UTC m=+1115.874288949" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.779587 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.788048 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.922087 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.931882 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.978197 4775 scope.go:117] "RemoveContainer" containerID="b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.107321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.313249 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:57 crc kubenswrapper[4775]: W0127 11:38:57.313317 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9862a859_ad75_4071_ad9a_ec926175e46d.slice/crio-c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a WatchSource:0}: Error finding container c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a: Status 404 returned error can't find the container with id c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.344551 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.418184 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.419691 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.506430 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.512027 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" containerID="cri-o://3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.515902 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" 
containerID="cri-o://b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.741551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.741666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"94dd9f79f758d901a5bff17b96dea4bc02bd0921b66a706ae353879746b66d0f"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.743040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.743070 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"332dd1d5955a659196a69a4a345219a2406c5c86fb913a32323480dc0fd29f46"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.776444 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" path="/var/lib/kubelet/pods/98c20582-df9c-4ed1-8c42-0d5d1783e6f4/volumes" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.777511 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"9b90bab51b264bc1493dfe140fad9990815019bebc9bcefad683cec6ac00649d"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 
11:38:57.777660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"65d8296d1a0a5c8204814b2dd9d5aae0c11dfced24f191ff27f67fd58d52aaea"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.777791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"335fcf56c5d7788d038c75a619f299a0f6b0c61a4fa38fe9339e8278584adfc0"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.783584 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" podStartSLOduration=2.783565959 podStartE2EDuration="2.783565959s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:57.780254228 +0000 UTC m=+1116.921852005" watchObservedRunningTime="2026-01-27 11:38:57.783565959 +0000 UTC m=+1116.925163736" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.785989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"d6039f83caf53b93ea687a21a178213a03ac82cd1ad840bff24a9e7dff45e91b"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.786036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"3080ba822c9c02a27ac7c0df05b5563a7b1ff6396ab1cf4bf9aed34ec048e87b"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.787489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" 
event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"cbeedb745ed20cd5c9e088a7e51f6dc279f1d562a3587ad830f0e89adee9d852"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.795218 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.798784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835084 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835345 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-695f7dfd45-zbb58" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log" containerID="cri-o://156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835767 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-695f7dfd45-zbb58" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker" containerID="cri-o://42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.353248 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:58 crc kubenswrapper[4775]: E0127 11:38:58.443720 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6b32f3_f53f_43ba_a349_2f00d5e657d0.slice/crio-conmon-b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6b32f3_f53f_43ba_a349_2f00d5e657d0.slice/crio-b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495068 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495411 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495630 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495660 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.497110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs" (OuterVolumeSpecName: "logs") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.497379 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.503547 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.511753 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr" (OuterVolumeSpecName: "kube-api-access-w2wkr") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "kube-api-access-w2wkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.540397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts" (OuterVolumeSpecName: "scripts") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597404 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597430 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597439 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597482 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597496 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.622582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.634619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data" (OuterVolumeSpecName: "config-data") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.642949 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.643299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701167 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701202 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701213 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701224 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.819998 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839387 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897cf85c8-ppd2f" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log" containerID="cri-o://c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839637 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839667 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839696 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897cf85c8-ppd2f" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api" containerID="cri-o://a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.875896 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerID="156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" exitCode=143 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.875974 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" 
event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerDied","Data":"156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.890089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"f18c44ffc3fe1dd758152a1e96e9f0872974147028a0adf96b9ca33df41bef76"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.891199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.891225 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.903881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.903968 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904107 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904287 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.906402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: 
"fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910161 4775 generic.go:334] "Generic (PLEG): container finished" podID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910193 4775 generic.go:334] "Generic (PLEG): container finished" podID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"fa5db8a5c7621b855f9aee7c911007cac93d44ed2023e821a1db694da3d675fa"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910305 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.912204 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.924776 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.943758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"c7cf618249468439ab9947bd79c2502d8d395f3ad8e3cc7a54d72b00a23938fe"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.943812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"3d2983e2073a6e23919993569a2db1e777666c79d856b0a59c0ce0ca9ce6a54e"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969141 4775 generic.go:334] "Generic (PLEG): container finished" podID="b138b14c-964d-465d-a534-c7aff1633e76" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"4be346d9744f80cbe9acdb090392b9c63c5e0cb6ed893fe6b3ae4a4e7c97ad5e"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969287 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.977668 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh" (OuterVolumeSpecName: "kube-api-access-fc4bh") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "kube-api-access-fc4bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.977781 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.978485 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5897cf85c8-ppd2f" podStartSLOduration=4.978468775 podStartE2EDuration="4.978468775s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.876166593 +0000 UTC m=+1118.017764360" watchObservedRunningTime="2026-01-27 11:38:58.978468775 +0000 UTC m=+1118.120066552" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993236 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993374 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" containerID="cri-o://9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993644 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" 
containerID="cri-o://5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.004839 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d66b74d76-ngwn9" podStartSLOduration=4.004819805 podStartE2EDuration="4.004819805s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.91763095 +0000 UTC m=+1118.059228727" watchObservedRunningTime="2026-01-27 11:38:59.004819805 +0000 UTC m=+1118.146417582" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010149 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010180 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010192 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010203 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.012959 4775 scope.go:117] "RemoveContainer" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016000 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" containerID="cri-o://c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016183 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016213 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" containerID="cri-o://7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.039122 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" podStartSLOduration=4.039105354 podStartE2EDuration="4.039105354s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.96416745 +0000 UTC m=+1118.105765227" watchObservedRunningTime="2026-01-27 11:38:59.039105354 +0000 UTC m=+1118.180703131" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.046083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.051290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data" (OuterVolumeSpecName: "config-data") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.081558 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082826 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log" containerID="cri-o://0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082964 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener" containerID="cri-o://9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085055 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.085164 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085198 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} err="failed to get container status \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085225 4775 scope.go:117] "RemoveContainer" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.096607 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.101640 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.101692 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} err="failed to get container status \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.101720 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105189 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} err="failed to get container status \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105221 4775 scope.go:117] "RemoveContainer" 
containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.110503 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} err="failed to get container status \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.111217 4775 scope.go:117] "RemoveContainer" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112654 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112937 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112978 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112992 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 
27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.113006 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.123547 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124021 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124041 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124052 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124060 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124075 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124083 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124109 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124115 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc 
kubenswrapper[4775]: E0127 11:38:59.124133 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124140 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124154 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124160 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124340 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124358 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124369 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124379 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124396 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124406 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 
11:38:59.125612 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.133416 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.133532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.151250 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podStartSLOduration=5.151228649 podStartE2EDuration="5.151228649s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:59.054275714 +0000 UTC m=+1118.195873491" watchObservedRunningTime="2026-01-27 11:38:59.151228649 +0000 UTC m=+1118.292826416" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.171509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.178569 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podStartSLOduration=5.178436213 podStartE2EDuration="5.178436213s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:59.091828114 +0000 UTC m=+1118.233425901" watchObservedRunningTime="2026-01-27 11:38:59.178436213 +0000 UTC m=+1118.320033990" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.193620 4775 scope.go:117] "RemoveContainer" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 
27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc 
kubenswrapper[4775]: I0127 11:38:59.214635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.254642 4775 scope.go:117] "RemoveContainer" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.267793 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": container with ID starting with 1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b not found: ID does not exist" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.267838 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} err="failed to get container status \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": rpc error: code = NotFound desc = could not find container \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": container with ID starting with 1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.267865 4775 scope.go:117] "RemoveContainer" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.269463 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": container with ID starting with 164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b not found: ID does not exist" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.269516 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"} err="failed to get container status \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": rpc error: code = NotFound desc = could not find container \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": container with ID starting with 164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.288326 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.297555 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318541 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318624 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318688 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 
crc kubenswrapper[4775]: I0127 11:38:59.318792 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319134 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319226 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.322847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.322960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.323272 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.329419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.339535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjc88\" (UniqueName: 
\"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.356149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.470519 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.520340 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.520398 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.800872 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b138b14c-964d-465d-a534-c7aff1633e76" path="/var/lib/kubelet/pods/b138b14c-964d-465d-a534-c7aff1633e76/volumes" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.807280 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" 
path="/var/lib/kubelet/pods/fe6b32f3-f53f-43ba-a349-2f00d5e657d0/volumes" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.845139 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.861893 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928242 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928410 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928587 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928602 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928664 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.933084 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs" (OuterVolumeSpecName: "logs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.936727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z" (OuterVolumeSpecName: "kube-api-access-nqt5z") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "kube-api-access-nqt5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.942626 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.996619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.019750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data" (OuterVolumeSpecName: "config-data") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030411 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030439 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030466 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030479 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030491 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.032546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.050708 4775 generic.go:334] "Generic (PLEG): container finished" podID="9862a859-ad75-4071-ad9a-ec926175e46d" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.050879 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.065600 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.067711 4775 generic.go:334] "Generic (PLEG): container finished" podID="31617f30-7431-401d-8c41-230d6a49ff72" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.067802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082622 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082677 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082792 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" 
event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"332dd1d5955a659196a69a4a345219a2406c5c86fb913a32323480dc0fd29f46"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082826 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.083055 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.110549 4775 generic.go:334] "Generic (PLEG): container finished" podID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerID="2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.110675 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120846 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerID="9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120895 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerID="0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120984 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132429 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent" containerID="cri-o://1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132759 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132823 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd" containerID="cri-o://cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132875 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent" containerID="cri-o://c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132954 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" 
containerName="sg-core" containerID="cri-o://164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.152111 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.152150 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.174324 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.175320 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.231025 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.234620 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.243219 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.614767303 podStartE2EDuration="13.243190384s" podCreationTimestamp="2026-01-27 11:38:47 +0000 UTC" firstStartedPulling="2026-01-27 11:38:48.391289366 +0000 UTC m=+1107.532887143" lastFinishedPulling="2026-01-27 11:38:59.019712447 +0000 UTC m=+1118.161310224" observedRunningTime="2026-01-27 11:39:00.167819377 +0000 UTC m=+1119.309417154" watchObservedRunningTime="2026-01-27 11:39:00.243190384 +0000 UTC m=+1119.384788161" Jan 27 11:39:00 crc 
kubenswrapper[4775]: I0127 11:39:00.253697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253941 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254065 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod 
\"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.255324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs" (OuterVolumeSpecName: "logs") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.255622 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.256728 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.256753 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.270631 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.277174 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.277683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts" (OuterVolumeSpecName: "scripts") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: E0127 11:39:00.283812 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283849 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} err="failed to get container status \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283874 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: E0127 11:39:00.285543 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.285565 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} err="failed 
to get container status \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.285579 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.291814 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} err="failed to get container status \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.291857 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.309099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh" (OuterVolumeSpecName: "kube-api-access-p7wrh") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "kube-api-access-p7wrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.309667 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} err="failed to get container status \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.354824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.355582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data" (OuterVolumeSpecName: "config-data") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359423 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359466 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359482 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359492 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359501 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359510 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.363528 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.427695 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463201 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463245 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463404 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463540 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod 
\"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463902 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.468028 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs" (OuterVolumeSpecName: "logs") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.473289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.479480 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm" (OuterVolumeSpecName: "kube-api-access-zjdqm") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "kube-api-access-zjdqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.510240 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565879 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565920 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565929 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565938 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.689315 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data" (OuterVolumeSpecName: "config-data") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.769354 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"815ca40b27fb4cea044b33dd23bf33c1b082f912269530f93879da29eb229030"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164251 4775 scope.go:117] "RemoveContainer" containerID="2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164383 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.169846 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.170162 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"7725e0d31cab8fdd988ddc82ff5c6e00f8aac8edb67890b0869f5c2b5c515d21"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.170232 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212229 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212262 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" exitCode=2 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212272 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212278 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212350 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} 
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"8683e19a7bcd30570af286ce01224a28b785c454609defaa562ddd8aa8e80071"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212430 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.229898 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"866d4ea18d0c5a4b3ffee3bd292c679cf834786becc86951c207630b9977d97c"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.234697 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.240769 4775 scope.go:117] "RemoveContainer" containerID="8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.278823 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279587 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279625 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279707 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279736 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.280750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.281112 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.283570 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerID="42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.284431 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.284587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerDied","Data":"42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.296603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts" (OuterVolumeSpecName: "scripts") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.303907 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h" (OuterVolumeSpecName: "kube-api-access-tvx4h") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "kube-api-access-tvx4h". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.319613 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.325092 4775 scope.go:117] "RemoveContainer" containerID="9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.343800 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") "
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") "
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394401 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") "
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394514 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") "
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394544 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") "
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.397082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs" (OuterVolumeSpecName: "logs") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398877 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398893 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398910 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398920 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398929 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.407627 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.408364 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h" (OuterVolumeSpecName: "kube-api-access-hh92h") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "kube-api-access-hh92h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.408411 4775 scope.go:117] "RemoveContainer" containerID="0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.412865 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424347 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424721 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424737 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424754 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424760 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424770 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424784 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424789 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424798 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424804 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424815 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424820 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424830 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424836 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424843 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424861 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424867 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424883 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424890 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424901 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424909 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424927 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424933 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425072 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425084 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425095 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425101 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425114 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425126 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425139 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425156 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425168 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425176 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425187 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.426130 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428511 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428664 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428912 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.447904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.449175 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.463334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.474255 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data" (OuterVolumeSpecName: "config-data") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.494113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data" (OuterVolumeSpecName: "config-data") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501200 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501249 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501264 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501986 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508516 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508548 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508560 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508571 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508582 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508591 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.585507 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.600554 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.609956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610178 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610221 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.611659 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.616170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.616235 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.621596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.627055 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.636747 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.636847 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645339 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645569 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.650767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.650827 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.656051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.667621 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.677550 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"]
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711825 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711994 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.712042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.713612 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.725560 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.755866 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.765347 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" path="/var/lib/kubelet/pods/134ee9b9-bd65-48fb-9593-d0f29112e77e/volumes"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.765979 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" path="/var/lib/kubelet/pods/ac6a9582-6a97-46b4-aa84-35ca9abe695c/volumes"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.766678 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" path="/var/lib/kubelet/pods/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64/volumes"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.767801 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" path="/var/lib/kubelet/pods/ca1756aa-c8c1-4f8e-9871-05e044a80c84/volumes"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.768379 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" path="/var/lib/kubelet/pods/ee6187b7-adff-4247-b9de-00f16380f27f/volumes"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.782158 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.814999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.816801 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.816998 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.817988 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818032 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818057 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0"
Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.818343 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818376 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc
error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818395 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.819380 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819419 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819432 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.819658 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 
1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819678 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819712 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819903 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819918 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821527 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821560 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.822936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.824207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826550 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with 
ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826581 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826830 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826853 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827110 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827158 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827177 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827474 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827517 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827752 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827781 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827939 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container 
status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828004 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828174 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828192 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828317 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828334 4775 scope.go:117] "RemoveContainer" 
containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828514 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828535 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828820 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828840 4775 scope.go:117] "RemoveContainer" containerID="42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.834193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 
11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.881506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.897967 4775 scope.go:117] "RemoveContainer" containerID="156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.989649 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.007340 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.323961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.324337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.324349 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" containerID="cri-o://bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" gracePeriod=30 Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.325123 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" 
containerID="cri-o://0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" gracePeriod=30 Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.354801 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.354787822 podStartE2EDuration="3.354787822s" podCreationTimestamp="2026-01-27 11:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:02.353349052 +0000 UTC m=+1121.494946839" watchObservedRunningTime="2026-01-27 11:39:02.354787822 +0000 UTC m=+1121.496385599" Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.515939 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.550042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: W0127 11:39:02.572743 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7f615a_76c9_440f_aee6_0d33ad750021.slice/crio-54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545 WatchSource:0}: Error finding container 54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545: Status 404 returned error can't find the container with id 54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.228162 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351703 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351816 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351938 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352007 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352166 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs" (OuterVolumeSpecName: "logs") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352601 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.353352 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.353376 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.358573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.364628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts" (OuterVolumeSpecName: "scripts") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.364638 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88" (OuterVolumeSpecName: "kube-api-access-xjc88") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "kube-api-access-xjc88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.390258 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.390303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"3e083a2d7efd3e69a8c080a7d1e8f3788d6a8410341ddd6b9bdeed055744c49d"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.395901 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.403406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404393 4775 generic.go:334] "Generic (PLEG): container finished" podID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" exitCode=0 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404431 4775 generic.go:334] "Generic (PLEG): container finished" podID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" exitCode=143 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"866d4ea18d0c5a4b3ffee3bd292c679cf834786becc86951c207630b9977d97c"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404546 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404920 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.428211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.442354 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data" (OuterVolumeSpecName: "config-data") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.443072 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455215 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455243 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455252 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 
11:39:03.455262 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455291 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455302 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.473800 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.556735 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.571371 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.572535 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.572580 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} err="failed to get container status \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": rpc error: code = NotFound desc = could not find container \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.572611 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.573168 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573201 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} err="failed to get container status \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573223 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573651 4775 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} err="failed to get container status \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": rpc error: code = NotFound desc = could not find container \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573745 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.574393 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} err="failed to get container status \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.762622 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.782413 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.815406 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.815867 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 
11:39:03.815885 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.815900 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.815908 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.816061 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.816081 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.817105 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.819535 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.824295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.830974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966199 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966302 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966363 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966650 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069298 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod 
\"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075852 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.092054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.115009 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.119729 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.145489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.417776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.417928 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" containerID="cri-o://3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" gracePeriod=30 Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.418369 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" containerID="cri-o://e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" gracePeriod=30 Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.425906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.425951 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.437238 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.446690 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.446673384 podStartE2EDuration="3.446673384s" podCreationTimestamp="2026-01-27 11:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:04.443295111 +0000 UTC m=+1123.584892888" watchObservedRunningTime="2026-01-27 11:39:04.446673384 +0000 UTC m=+1123.588271161" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.583062 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.584908 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.596943 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679270 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679372 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781837 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781862 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782129 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782257 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.790673 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod 
\"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.791973 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.793975 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.794211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.794219 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.796755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc 
kubenswrapper[4775]: I0127 11:39:04.803771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.914624 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.056966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.092102 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.190881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc 
kubenswrapper[4775]: I0127 11:39:05.191097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191530 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.192191 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: 
"d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.192382 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs" (OuterVolumeSpecName: "logs") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.197810 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts" (OuterVolumeSpecName: "scripts") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.198858 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh" (OuterVolumeSpecName: "kube-api-access-zz8mh") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "kube-api-access-zz8mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.202552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.237534 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.260329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.265801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data" (OuterVolumeSpecName: "config-data") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293430 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293476 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293512 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293521 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293533 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293540 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293548 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293557 4775 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.321644 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.395293 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442046 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" exitCode=0 Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442080 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" exitCode=143 Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442160 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"3e083a2d7efd3e69a8c080a7d1e8f3788d6a8410341ddd6b9bdeed055744c49d"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444608 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.458394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.465199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"4b5a2197f63f7e229b480975e868d0f8d5dab8b2b247e1b010060a420354577c"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.543085 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 
crc kubenswrapper[4775]: I0127 11:39:05.544383 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.553463 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.567703 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.568055 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.568094 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568100 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568268 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568286 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.569180 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.572728 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.572938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.599233 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.626607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.641021 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.642129 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not exist" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.642172 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} err="failed to get container status \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not exist" Jan 27 
11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.642200 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.644033 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644078 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} err="failed to get container status \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644112 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644417 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} err="failed to get container status \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not 
exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644443 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644829 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} err="failed to get container status \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705324 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705433 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 
11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705468 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.756851 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" path="/var/lib/kubelet/pods/34fbc599-e3e9-4317-a306-f1b4d677cd84/volumes" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.757714 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" path="/var/lib/kubelet/pods/d4944db9-7805-486d-bd2f-38245c9eecbf/volumes" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806915 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807145 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807289 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.808977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.809923 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.810783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.813255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.815826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.816052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 
11:39:05.830438 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.841198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.879890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.902763 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.477952 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"c86790369dd857d1339351cb0cc3b769915d8acc3e30cef67b677d972658fab5"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"60553b81c31faad82df95aaef48f5879e450ffeefbcf8061f9e527a356e9485b"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"f13b1d6e397378d1eee56a32024bde6e8c9531482a5a4d55f375e52235aecf63"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"19c5e38a7afe1201ed2fcf03488fde5358e005044bc9b2f06a799a829ad93eda"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.481652 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.514678 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c59c678b7-lbtkp" podStartSLOduration=2.514654335 podStartE2EDuration="2.514654335s" podCreationTimestamp="2026-01-27 11:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:06.500353749 +0000 UTC m=+1125.641951526" 
watchObservedRunningTime="2026-01-27 11:39:06.514654335 +0000 UTC m=+1125.656252112" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.547239 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.497934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498375 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent" containerID="cri-o://aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498900 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd" containerID="cri-o://fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498958 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core" containerID="cri-o://ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498994 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" 
containerName="ceilometer-notification-agent" containerID="cri-o://511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.514900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"549b5fac54367f7d975358805a6394a66ed3b495f484400d7487054f1267fe23"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.532586 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"8afdb7a742646a4746b16324c837e645ca5009f0a8b343509ea9298ff6e74d52"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.532645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"a568fb9e5bd89e96bdc6bd85d9565d6d4542a090a3f31f684a01c2bb795fc74c"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.534343 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.991628057 podStartE2EDuration="6.534320359s" podCreationTimestamp="2026-01-27 11:39:01 +0000 UTC" firstStartedPulling="2026-01-27 11:39:02.578176548 +0000 UTC m=+1121.719774325" lastFinishedPulling="2026-01-27 11:39:07.12086885 +0000 UTC m=+1126.262466627" observedRunningTime="2026-01-27 11:39:07.523186281 +0000 UTC m=+1126.664784058" watchObservedRunningTime="2026-01-27 11:39:07.534320359 +0000 UTC m=+1126.675918126" Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.555783 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.555760462 podStartE2EDuration="4.555760462s" podCreationTimestamp="2026-01-27 11:39:03 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:07.543860962 +0000 UTC m=+1126.685458729" watchObservedRunningTime="2026-01-27 11:39:07.555760462 +0000 UTC m=+1126.697358239" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.007850 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.491552 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.570874 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.571110 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" containerID="cri-o://9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" gracePeriod=30 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.571234 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" containerID="cri-o://88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" gracePeriod=30 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.573531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"af0056b03ab936b02a4f430068d2d283b8afd820daef0e50054f68024a0623c1"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595682 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" exitCode=0 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595716 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" exitCode=2 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595725 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" exitCode=0 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596525 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.608896 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.608878362 podStartE2EDuration="3.608878362s" podCreationTimestamp="2026-01-27 11:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:08.598228177 +0000 
UTC m=+1127.739825944" watchObservedRunningTime="2026-01-27 11:39:08.608878362 +0000 UTC m=+1127.750476139" Jan 27 11:39:08 crc kubenswrapper[4775]: E0127 11:39:08.766252 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59717e39_e3c7_40b2_89c7_7b898f3b72e7.slice/crio-conmon-9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59717e39_e3c7_40b2_89c7_7b898f3b72e7.slice/crio-9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:39:09 crc kubenswrapper[4775]: I0127 11:39:09.639166 4775 generic.go:334] "Generic (PLEG): container finished" podID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" exitCode=143 Jan 27 11:39:09 crc kubenswrapper[4775]: I0127 11:39:09.639487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"} Jan 27 11:39:10 crc kubenswrapper[4775]: I0127 11:39:10.648867 4775 generic.go:334] "Generic (PLEG): container finished" podID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerID="cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0" exitCode=0 Jan 27 11:39:10 crc kubenswrapper[4775]: I0127 11:39:10.648943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerDied","Data":"cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0"} Jan 27 11:39:11 crc kubenswrapper[4775]: I0127 11:39:11.851493 4775 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Jan 27 11:39:11 crc kubenswrapper[4775]: I0127 11:39:11.854166 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.006637 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083665 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083893 
4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.088749 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n" (OuterVolumeSpecName: "kube-api-access-vbw4n") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "kube-api-access-vbw4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.092361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts" (OuterVolumeSpecName: "scripts") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.115308 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data" (OuterVolumeSpecName: "config-data") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.116374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185901 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185947 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185962 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185977 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.211666 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.343054 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390086 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390351 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390383 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390444 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.393218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs" (OuterVolumeSpecName: "logs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.399546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.399784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn" (OuterVolumeSpecName: "kube-api-access-jsdsn") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "kube-api-access-jsdsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.438444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.442629 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.444289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data" (OuterVolumeSpecName: "config-data") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.445769 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492516 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492618 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgc8\" (UniqueName: 
\"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492670 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493064 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493086 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493098 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493106 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") on node \"crc\" DevicePath \"\"" 
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493115 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493124 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493132 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493140 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493112 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.495885 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8" (OuterVolumeSpecName: "kube-api-access-skgc8") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "kube-api-access-skgc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.497206 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts" (OuterVolumeSpecName: "scripts") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.515939 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.559375 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594549 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594585 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594594 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594602 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594612 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.595035 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data" (OuterVolumeSpecName: "config-data") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667046 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerDied","Data":"d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667399 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667081 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.669932 4775 generic.go:334] "Generic (PLEG): container finished" podID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" exitCode=0 Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670001 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670118 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"7741a0906f599fd7687720fdb78021f6c23a07fdd0533bbdc83dc1e97a16a161"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670138 4775 scope.go:117] "RemoveContainer" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673703 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" exitCode=0 Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673749 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673775 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673800 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.696357 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.731381 4775 scope.go:117] "RemoveContainer" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.732787 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.743397 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.753340 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.766765 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.766831 4775 scope.go:117] "RemoveContainer" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.767904 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": container with ID starting with 88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c not found: ID does not exist" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.767970 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"} 
err="failed to get container status \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": rpc error: code = NotFound desc = could not find container \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": container with ID starting with 88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768018 4775 scope.go:117] "RemoveContainer" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.768508 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": container with ID starting with 9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b not found: ID does not exist" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768531 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"} err="failed to get container status \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": rpc error: code = NotFound desc = could not find container \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": container with ID starting with 9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768546 4775 scope.go:117] "RemoveContainer" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778434 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778884 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778907 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778932 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778941 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778953 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778961 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778977 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778986 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779009 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779017 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779042 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779050 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779060 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779068 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779312 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779334 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779352 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779365 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779378 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779389 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779404 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.782098 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.787072 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.787930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.793446 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.811799 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.815669 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.837938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.838445 4775 scope.go:117] "RemoveContainer" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.840776 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.847941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899720 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899828 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899896 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900030 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900070 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.917694 4775 scope.go:117] "RemoveContainer" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.946555 4775 scope.go:117] "RemoveContainer" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966013 4775 scope.go:117] "RemoveContainer" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.966441 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": container with ID starting with fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2 not found: ID does not exist" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966516 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} err="failed to get container status \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": rpc error: code = NotFound desc = could not find container \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": container with ID starting with fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2 not found: ID does not exist"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966547 4775 scope.go:117] "RemoveContainer" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.967681 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": container with ID starting with ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9 not found: ID does not exist" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.967719 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} err="failed to get container status \"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": rpc error: code = NotFound desc = could not find container \"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": container with ID starting with ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9 not found: ID does not exist"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.967741 4775 scope.go:117] "RemoveContainer" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.968226 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": container with ID starting with 511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84 not found: ID does not exist" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968255 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} err="failed to get container status \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": rpc error: code = NotFound desc = could not find container \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": container with ID starting with 511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84 not found: ID does not exist"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968277 4775 scope.go:117] "RemoveContainer" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"
Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.968526 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": container with ID starting with aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1 not found: ID does not exist" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"
Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968549 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} err="failed to get container status \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": rpc error: code = NotFound desc = could not find container \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": container with ID starting with aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1 not found: ID does not exist"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001853 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001881 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.002507 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.002967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007369 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007910 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008343 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008674 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.026074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.032772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.226784 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.237211 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.758894 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" path="/var/lib/kubelet/pods/0e7f615a-76c9-440f-aee6-0d33ad750021/volumes"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.760135 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" path="/var/lib/kubelet/pods/59717e39-e3c7-40b2-89c7-7b898f3b72e7/volumes"
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.779154 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.795611 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.852494 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.438608 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.438841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.486386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.490275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.535052 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693623 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerStarted","Data":"cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6"}
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerStarted","Data":"82d4d61311885172aa8b3e5cc80375eb709a13d1d92b08eb5c2530bda351308b"}
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693770 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" gracePeriod=30
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.694639 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.698676 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"2a2ee9ecd020ed63d838c367608617b5c5b9bef053fb9d27e529ac66f6e55c5a"}
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.698720 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.699033 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.709512 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.709492832 podStartE2EDuration="2.709492832s" podCreationTimestamp="2026-01-27 11:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:14.708471703 +0000 UTC m=+1133.850069490" watchObservedRunningTime="2026-01-27 11:39:14.709492832 +0000 UTC m=+1133.851090609"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.175980 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f49dbf586-l2cmp"]
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.177741 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.203269 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f49dbf586-l2cmp"]
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242767 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.343956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344200 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345142 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.349300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.350236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.350905 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.353367 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.354923 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.359822 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.494931 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.714885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c"}
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.715237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907"}
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.904126 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.906123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.946595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.972636 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f49dbf586-l2cmp"]
Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.993866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.724611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c"}
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"8e6019244658e5cfc5a7245dc638e28477faa58477eb7d39abb672b26b32efcb"}
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"875c83b883fa3bd728bd63d7a970181ee212cc95fecb472fd0f7adc7fa462bcb"}
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726822 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"62ecae1c1001fdf4b7f185c0a5db15c9dc33752f0d26ed2714229213f160919a"}
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726917 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.727000 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.746398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.746494 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.752726 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f49dbf586-l2cmp" podStartSLOduration=1.752702127 podStartE2EDuration="1.752702127s" podCreationTimestamp="2026-01-27 11:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:16.75101122 +0000 UTC m=+1135.892608997" watchObservedRunningTime="2026-01-27 11:39:16.752702127 +0000 UTC m=+1135.894299904"
Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.753287 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 27 11:39:17 crc kubenswrapper[4775]: I0127 11:39:17.734940 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:17 crc kubenswrapper[4775]: I0127 11:39:17.734988 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f49dbf586-l2cmp"
Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.749997 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent" containerID="cri-o://19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907" gracePeriod=30
Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750483 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9"}
Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750528 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750655 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" containerID="cri-o://87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9" gracePeriod=30
Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750726 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197"
containerName="sg-core" containerID="cri-o://b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750784 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent" containerID="cri-o://5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.773537 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.773818 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.783090 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.207016299 podStartE2EDuration="6.783072256s" podCreationTimestamp="2026-01-27 11:39:12 +0000 UTC" firstStartedPulling="2026-01-27 11:39:13.802173529 +0000 UTC m=+1132.943771306" lastFinishedPulling="2026-01-27 11:39:18.378229476 +0000 UTC m=+1137.519827263" observedRunningTime="2026-01-27 11:39:18.780194376 +0000 UTC m=+1137.921792173" watchObservedRunningTime="2026-01-27 11:39:18.783072256 +0000 UTC m=+1137.924670043" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.824275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760414 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c" exitCode=2 Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760847 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c" exitCode=0 Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760566 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c"} Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.761769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c"} Jan 27 11:39:21 crc kubenswrapper[4775]: I0127 11:39:21.779264 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907" exitCode=0 Jan 27 11:39:21 crc kubenswrapper[4775]: I0127 11:39:21.779310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907"} Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.239500 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.241151 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.243029 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.243123 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.239685 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.241597 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.243019 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.243049 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.333531 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9862a859_ad75_4071_ad9a_ec926175e46d.slice/crio-conmon-5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.443689 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.448547 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.518490 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.518567 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.522276 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523165 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523656 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523767 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523805 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: 
\"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523853 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs" (OuterVolumeSpecName: "logs") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.524300 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.525155 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs" (OuterVolumeSpecName: "logs") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.530644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s" (OuterVolumeSpecName: "kube-api-access-4zn9s") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "kube-api-access-4zn9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.532649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.541639 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2" (OuterVolumeSpecName: "kube-api-access-hvqr2") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "kube-api-access-hvqr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.541716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.547138 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.552381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.574291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data" (OuterVolumeSpecName: "config-data") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.580772 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data" (OuterVolumeSpecName: "config-data") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626404 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626545 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626559 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626574 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626585 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626597 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626610 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626620 4775 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626631 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851606 4775 generic.go:334] "Generic (PLEG): container finished" podID="9862a859-ad75-4071-ad9a-ec926175e46d" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" exitCode=137 Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851814 4775 scope.go:117] "RemoveContainer" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851813 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854518 4775 generic.go:334] "Generic (PLEG): container finished" podID="31617f30-7431-401d-8c41-230d6a49ff72" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" exitCode=137 Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854608 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"94dd9f79f758d901a5bff17b96dea4bc02bd0921b66a706ae353879746b66d0f"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854678 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.889733 4775 scope.go:117] "RemoveContainer" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.903702 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.912375 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.912889 4775 scope.go:117] "RemoveContainer" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.913432 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": container with ID starting with 5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba not found: ID does not exist" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.913539 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} err="failed to get container status \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": rpc error: code = NotFound desc = could not find container \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": container with ID starting with 5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.913576 4775 scope.go:117] "RemoveContainer" 
containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.913990 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": container with ID starting with 9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3 not found: ID does not exist" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.914192 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} err="failed to get container status \"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": rpc error: code = NotFound desc = could not find container \"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": container with ID starting with 9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3 not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.914400 4775 scope.go:117] "RemoveContainer" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.920887 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.928043 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.932973 4775 scope.go:117] "RemoveContainer" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.956687 4775 scope.go:117] "RemoveContainer" 
containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.957132 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": container with ID starting with 7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f not found: ID does not exist" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957178 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} err="failed to get container status \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": rpc error: code = NotFound desc = could not find container \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": container with ID starting with 7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957198 4775 scope.go:117] "RemoveContainer" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.957616 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": container with ID starting with c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63 not found: ID does not exist" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957635 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} err="failed to get container status \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": rpc error: code = NotFound desc = could not find container \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": container with ID starting with c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63 not found: ID does not exist" Jan 27 11:39:31 crc kubenswrapper[4775]: I0127 11:39:31.757716 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31617f30-7431-401d-8c41-230d6a49ff72" path="/var/lib/kubelet/pods/31617f30-7431-401d-8c41-230d6a49ff72/volumes" Jan 27 11:39:31 crc kubenswrapper[4775]: I0127 11:39:31.758917 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" path="/var/lib/kubelet/pods/9862a859-ad75-4071-ad9a-ec926175e46d/volumes" Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.238840 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.241407 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.242641 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.242772 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.925818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.986273 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.987437 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f57cbf767-xvk7k" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" containerID="cri-o://0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" gracePeriod=30 Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.987625 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f57cbf767-xvk7k" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" containerID="cri-o://59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" gracePeriod=30 Jan 27 11:39:35 crc kubenswrapper[4775]: I0127 11:39:35.914971 4775 generic.go:334] "Generic (PLEG): container finished" podID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerID="59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" exitCode=0 Jan 27 11:39:35 crc kubenswrapper[4775]: I0127 11:39:35.915077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" 
event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f"} Jan 27 11:39:36 crc kubenswrapper[4775]: I0127 11:39:36.925234 4775 generic.go:334] "Generic (PLEG): container finished" podID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerID="0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" exitCode=0 Jan 27 11:39:36 crc kubenswrapper[4775]: I0127 11:39:36.925418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23"} Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.143355 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: 
I0127 11:39:37.274336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274700 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.280247 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.282175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82" (OuterVolumeSpecName: "kube-api-access-2rp82") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "kube-api-access-2rp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.334623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.334674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.335511 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config" (OuterVolumeSpecName: "config") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.337439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.355182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376941 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376982 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376995 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377005 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 
11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377031 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377042 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.935482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"53a128ffc6e310fa157dfd37a105cff396b2195c605357ef2976ef48f28caaf9"} Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.936903 4775 scope.go:117] "RemoveContainer" containerID="59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.935719 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.968804 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.971532 4775 scope.go:117] "RemoveContainer" containerID="0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.977675 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.240631 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.242943 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.244199 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.244239 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:39 crc kubenswrapper[4775]: I0127 11:39:39.756052 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" path="/var/lib/kubelet/pods/17e205ad-6676-4f5d-b9d0-0d8c958d815d/volumes" Jan 27 11:39:43 crc kubenswrapper[4775]: I0127 11:39:43.230947 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.239908 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.241668 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.244353 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.244470 4775 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:44 crc kubenswrapper[4775]: I0127 11:39:44.996568 4775 generic.go:334] "Generic (PLEG): container finished" podID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" exitCode=137 Jan 27 11:39:44 crc kubenswrapper[4775]: I0127 11:39:44.996808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerDied","Data":"cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6"} Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.112593 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211848 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod 
\"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.217540 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz" (OuterVolumeSpecName: "kube-api-access-mhrxz") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "kube-api-access-mhrxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.242055 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data" (OuterVolumeSpecName: "config-data") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.246411 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314060 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314101 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314116 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerDied","Data":"82d4d61311885172aa8b3e5cc80375eb709a13d1d92b08eb5c2530bda351308b"} Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006588 4775 scope.go:117] "RemoveContainer" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006709 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.033351 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.047913 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065400 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065784 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065803 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065824 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065830 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065841 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065847 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065859 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 
11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065877 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065882 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065899 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065905 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065920 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065926 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066093 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066108 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066120 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31617f30-7431-401d-8c41-230d6a49ff72" 
containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066137 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066159 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066174 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066717 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.076659 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.079332 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.079583 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.135867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 
11:39:46.135934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.136395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.237821 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.238224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.238419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.249198 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.252135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.254941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.393349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.549606 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.553428 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.633889 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.634408 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b9b59fc66-t6rbl" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log" containerID="cri-o://c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84" gracePeriod=30 Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.634877 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b9b59fc66-t6rbl" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api" containerID="cri-o://174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7" gracePeriod=30 Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.878129 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: W0127 11:39:46.882683 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7617063e_fa32_45fc_b06e_7ecff629f7db.slice/crio-7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8 WatchSource:0}: Error finding container 7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8: Status 404 returned error can't find the container with id 
7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8
Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.016550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerStarted","Data":"7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8"}
Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.019025 4775 generic.go:334] "Generic (PLEG): container finished" podID="926c665f-b922-4372-85aa-bbe29399eaac" containerID="c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84" exitCode=143
Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.019095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84"}
Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.754599 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" path="/var/lib/kubelet/pods/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9/volumes"
Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.029409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerStarted","Data":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"}
Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.030603 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.055297 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.055275666 podStartE2EDuration="2.055275666s" podCreationTimestamp="2026-01-27 11:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:48.045464495 +0000 UTC m=+1167.187062282" watchObservedRunningTime="2026-01-27 11:39:48.055275666 +0000 UTC m=+1167.196873443"
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.041985 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9" exitCode=137
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.042082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9"}
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.315334 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494168 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494291 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494433 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") "
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.495833 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.495956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.500032 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn" (OuterVolumeSpecName: "kube-api-access-rtljn") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "kube-api-access-rtljn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.500726 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts" (OuterVolumeSpecName: "scripts") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.533400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.577646 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598344 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598389 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598401 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598411 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598424 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598433 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.616484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data" (OuterVolumeSpecName: "config-data") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.701014 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.055756 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"2a2ee9ecd020ed63d838c367608617b5c5b9bef053fb9d27e529ac66f6e55c5a"}
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.056125 4775 scope.go:117] "RemoveContainer" containerID="87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.055857 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.061758 4775 generic.go:334] "Generic (PLEG): container finished" podID="926c665f-b922-4372-85aa-bbe29399eaac" containerID="174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7" exitCode=0
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.061905 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7"}
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.082921 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.088153 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109027 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109068 4775 scope.go:117] "RemoveContainer" containerID="b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c"
Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109538 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109555 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core"
Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109574 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109581 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109606 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109614 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd"
Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109641 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109649 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109844 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109860 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109871 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109894 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.111543 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.117850 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.147323 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.151118 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.193902 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.196233 4775 scope.go:117] "RemoveContainer" containerID="5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212463 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.217238 4775 scope.go:117] "RemoveContainer" containerID="19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314082 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314152 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314239 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") "
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314494 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.315470 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs" (OuterVolumeSpecName: "logs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.316124 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.317136 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.319634 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx" (OuterVolumeSpecName: "kube-api-access-jspwx") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "kube-api-access-jspwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.321023 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts" (OuterVolumeSpecName: "scripts") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.321871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.323595 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.324234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.327524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.333650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.378561 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data" (OuterVolumeSpecName: "config-data") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.379198 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415440 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415489 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415499 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415511 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415523 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.416635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.422425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.506601 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.517856 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.517895 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.926321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:39:50 crc kubenswrapper[4775]: W0127 11:39:50.929035 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877bcef1_579c_413c_a0c0_6dad63885091.slice/crio-eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229 WatchSource:0}: Error finding container eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229: Status 404 returned error can't find the container with id eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073180 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7"}
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073297 4775 scope.go:117] "RemoveContainer" containerID="174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7"
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.074302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229"}
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.112086 4775 scope.go:117] "RemoveContainer" containerID="c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84"
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.112234 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"]
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.120390 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"]
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.788728 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926c665f-b922-4372-85aa-bbe29399eaac" path="/var/lib/kubelet/pods/926c665f-b922-4372-85aa-bbe29399eaac/volumes"
Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.790140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" path="/var/lib/kubelet/pods/f3ab198a-6671-407e-931d-e1e6dc109197/volumes"
Jan 27 11:39:52 crc kubenswrapper[4775]: I0127 11:39:52.083643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"}
Jan 27 11:39:53 crc kubenswrapper[4775]: I0127 11:39:53.108781 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"}
Jan 27 11:39:54 crc kubenswrapper[4775]: I0127 11:39:54.120333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"}
Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.142071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"}
Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.142895 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.170117 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.075568595 podStartE2EDuration="6.170095728s" podCreationTimestamp="2026-01-27 11:39:50 +0000 UTC" firstStartedPulling="2026-01-27 11:39:50.930959871 +0000 UTC m=+1170.072557648" lastFinishedPulling="2026-01-27 11:39:55.025487004 +0000 UTC m=+1174.167084781" observedRunningTime="2026-01-27 11:39:56.168322928 +0000 UTC m=+1175.309920705" watchObservedRunningTime="2026-01-27 11:39:56.170095728 +0000 UTC m=+1175.311693515"
Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.421631 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.121871 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"]
Jan 27 11:39:57 crc kubenswrapper[4775]: E0127 11:39:57.122679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122701 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log"
Jan 27 11:39:57 crc kubenswrapper[4775]: E0127 11:39:57.122726 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122737 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.123004 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.123763 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.129219 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.129875 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.172797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"]
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241373 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241459 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241625 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\"
(UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.287185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.288677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.294481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.330169 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342985 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc 
kubenswrapper[4775]: I0127 11:39:57.343010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343139 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.351080 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.351819 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.358254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.364413 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.365527 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.370409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.370963 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.385492 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448311 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448489 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448540 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448566 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.453532 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod 
\"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.460306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.467742 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.469164 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.483055 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.484253 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.518059 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.528196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 
11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552777 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.572127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.577407 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.593195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.602780 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.603986 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.608098 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.608244 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.610223 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.621536 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.626071 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.628219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654775 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95v4f\" (UniqueName: 
\"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.657322 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") 
" pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.662914 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.673068 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.681072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756690 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756735 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756768 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756824 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.763661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.766406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.788522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861287 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861324 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.862074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.862171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.864259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.864791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.867316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.884015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.888412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.911893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.944559 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.957833 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.145157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.179721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerStarted","Data":"50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c"}
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.248538 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.250308 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.253993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.254192 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.303322 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.342205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.378423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.378748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.379055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.379170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.453007 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.481402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.481953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.482058 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.482153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.487643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.489704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.490332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.499141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.616099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.632509 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.712297 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"]
Jan 27 11:39:58 crc kubenswrapper[4775]: W0127 11:39:58.726826 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode869df3d_c15b_4610_bb78_00ad49940d17.slice/crio-52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e WatchSource:0}: Error finding container 52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e: Status 404 returned error can't find the container with id 52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e
Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.735779 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.191131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"234a0c458af079345c8244f18b5988ff4056e6dc102c0dcac07740fc8e5eeb55"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.200318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerStarted","Data":"ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.203800 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.203864 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.212546 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"589962a3449aa5fdeff7c986358683ffe6f8f1614193eefd4398944d9fb0173e"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.229516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerStarted","Data":"7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.244375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerStarted","Data":"6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516"}
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.253565 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"]
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.253823 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-m2t9b" podStartSLOduration=2.253808133 podStartE2EDuration="2.253808133s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:59.244716361 +0000 UTC m=+1178.386314138" watchObservedRunningTime="2026-01-27 11:39:59.253808133 +0000 UTC m=+1178.395405910"
Jan 27 11:39:59 crc kubenswrapper[4775]: W0127 11:39:59.254723 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3942760_c6b4_43b5_9680_48d8b8ae3854.slice/crio-3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386 WatchSource:0}: Error finding container 3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386: Status 404 returned error can't find the container with id 3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517502 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517556 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517597 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.518249 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.518300 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" gracePeriod=600
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.271393 4775 generic.go:334] "Generic (PLEG): container finished" podID="e869df3d-c15b-4610-bb78-00ad49940d17" containerID="0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68" exitCode=0
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68"}
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760"}
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272256 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc"
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276666 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" exitCode=0
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"}
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276846 4775 scope.go:117] "RemoveContainer" containerID="d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.279274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerStarted","Data":"b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e"}
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.279313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerStarted","Data":"3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386"}
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.309385 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" podStartSLOduration=3.30936532 podStartE2EDuration="3.30936532s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:00.292149434 +0000 UTC m=+1179.433747231" watchObservedRunningTime="2026-01-27 11:40:00.30936532 +0000 UTC m=+1179.450963087"
Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.319121 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" podStartSLOduration=2.318897084 podStartE2EDuration="2.318897084s" podCreationTimestamp="2026-01-27 11:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:00.310290166 +0000 UTC m=+1179.451887943" watchObservedRunningTime="2026-01-27 11:40:00.318897084 +0000 UTC m=+1179.460494851"
Jan 27 11:40:01 crc kubenswrapper[4775]: I0127 11:40:01.461512 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:01 crc kubenswrapper[4775]: I0127 11:40:01.472655 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.301818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.302398 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.303790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerStarted","Data":"5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.303919 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" gracePeriod=30
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.313865 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.313915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.314034 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log" containerID="cri-o://6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" gracePeriod=30
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.314174 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata" containerID="cri-o://e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" gracePeriod=30
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.319194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.321441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerStarted","Data":"c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d"}
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.374639 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.360300912 podStartE2EDuration="5.374617215s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.312123629 +0000 UTC m=+1177.453721406" lastFinishedPulling="2026-01-27 11:40:01.326439932 +0000 UTC m=+1180.468037709" observedRunningTime="2026-01-27 11:40:02.333026993 +0000 UTC m=+1181.474624770" watchObservedRunningTime="2026-01-27 11:40:02.374617215 +0000 UTC m=+1181.516214992"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.375182 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.515363535 podStartE2EDuration="5.37517696s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.46349171 +0000 UTC m=+1177.605089487" lastFinishedPulling="2026-01-27 11:40:01.323305135 +0000 UTC m=+1180.464902912" observedRunningTime="2026-01-27 11:40:02.36794337 +0000 UTC m=+1181.509541147" watchObservedRunningTime="2026-01-27 11:40:02.37517696 +0000 UTC m=+1181.516774737"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.401983 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.690785042 podStartE2EDuration="5.401970423s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.621169795 +0000 UTC m=+1177.762767572" lastFinishedPulling="2026-01-27 11:40:01.332355176 +0000 UTC m=+1180.473952953" observedRunningTime="2026-01-27 11:40:02.400614036 +0000 UTC m=+1181.542211823" watchObservedRunningTime="2026-01-27 11:40:02.401970423 +0000 UTC m=+1181.543568200"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.431581 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.896297942 podStartE2EDuration="5.431559492s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.798591308 +0000 UTC m=+1177.940189085" lastFinishedPulling="2026-01-27 11:40:01.333852858 +0000 UTC m=+1180.475450635" observedRunningTime="2026-01-27 11:40:02.426839751 +0000 UTC m=+1181.568437528" watchObservedRunningTime="2026-01-27 11:40:02.431559492 +0000 UTC m=+1181.573157269"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.882964 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.885087 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.945740 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982626 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") "
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") "
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") "
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") "
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.983044 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs" (OuterVolumeSpecName: "logs") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.983267 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.988987 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f" (OuterVolumeSpecName: "kube-api-access-95v4f") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "kube-api-access-95v4f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.012722 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data" (OuterVolumeSpecName: "config-data") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.015621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084546 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084857 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084867 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330522 4775 generic.go:334] "Generic (PLEG): container finished" podID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" exitCode=0
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330568 4775 generic.go:334] "Generic (PLEG): container finished" podID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" exitCode=143
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330580 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"}
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"}
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330741 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"589962a3449aa5fdeff7c986358683ffe6f8f1614193eefd4398944d9fb0173e"}
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330761 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.415170 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.419888 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.439481 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464303 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"
Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.464804 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464866 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} err="failed to get container status \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464896 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"
Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.465181 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465202 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} err="failed to get container status \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465216 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465425 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} err="failed to get container status \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465459 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.466823 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} err="failed to get container status \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468403 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.468855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468872 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log"
Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.468902 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468908 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.469179 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.469199 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata"
Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.470710 4775 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.476285 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.476519 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.479994 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.595649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698640 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.699966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.705242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.705556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.707139 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.722953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod 
\"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.760863 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" path="/var/lib/kubelet/pods/c27adc3b-07b5-457f-96f5-cfffea2e34b8/volumes" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.791046 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:04 crc kubenswrapper[4775]: I0127 11:40:04.305888 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:04 crc kubenswrapper[4775]: I0127 11:40:04.340360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"7a575331b653caf550f94723cfa812dc6bcd3d3fd0557ee1db29564218b576be"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.350366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.350908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.374595 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.374581562 podStartE2EDuration="2.374581562s" podCreationTimestamp="2026-01-27 11:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 11:40:05.37161864 +0000 UTC m=+1184.513216417" watchObservedRunningTime="2026-01-27 11:40:05.374581562 +0000 UTC m=+1184.516179339" Jan 27 11:40:06 crc kubenswrapper[4775]: I0127 11:40:06.359854 4775 generic.go:334] "Generic (PLEG): container finished" podID="8726531a-a74e-48cd-a274-6f67ae507560" containerID="7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7" exitCode=0 Jan 27 11:40:06 crc kubenswrapper[4775]: I0127 11:40:06.359914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerDied","Data":"7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7"} Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.370531 4775 generic.go:334] "Generic (PLEG): container finished" podID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerID="b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e" exitCode=0 Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.370643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerDied","Data":"b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e"} Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.609410 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.609491 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.749240 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880277 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.886719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts" (OuterVolumeSpecName: "scripts") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.888165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc" (OuterVolumeSpecName: "kube-api-access-m26fc") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "kube-api-access-m26fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.913294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data" (OuterVolumeSpecName: "config-data") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.917920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.945273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.960373 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.980436 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983022 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983057 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983068 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983077 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.080653 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.080979 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" 
podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns" containerID="cri-o://a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615" gracePeriod=10 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerDied","Data":"50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c"} Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385754 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385829 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.394026 4775 generic.go:334] "Generic (PLEG): container finished" podID="91668934-529e-4df9-b41f-8cd54e5920ea" containerID="a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615" exitCode=0 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.395030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615"} Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.448666 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.551251 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.609709 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.623671 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.624050 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" containerID="cri-o://12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.624608 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" containerID="cri-o://a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.653258 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.663607 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.663806 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" 
containerName="nova-metadata-log" containerID="cri-o://b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.664222 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata" containerID="cri-o://408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696784 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696860 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696889 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.730752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx" (OuterVolumeSpecName: "kube-api-access-q6lcx") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "kube-api-access-q6lcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.785222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.785233 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.791687 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.791767 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801393 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801432 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801464 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.805005 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.808641 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.829140 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config" (OuterVolumeSpecName: "config") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902783 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902833 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902850 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.973442 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.078361 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106090 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106152 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106945 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.107237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.110233 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl" (OuterVolumeSpecName: "kube-api-access-ndwhl") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "kube-api-access-ndwhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.110732 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts" (OuterVolumeSpecName: "scripts") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.133669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data" (OuterVolumeSpecName: "config-data") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.134949 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.195695 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209212 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209246 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209260 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209271 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310549 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") "
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.311029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs" (OuterVolumeSpecName: "logs") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.311591 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.314859 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w" (OuterVolumeSpecName: "kube-api-access-vxq6w") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "kube-api-access-vxq6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.334224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data" (OuterVolumeSpecName: "config-data") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.336285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.358070 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405632 4775 generic.go:334] "Generic (PLEG): container finished" podID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" exitCode=0
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405670 4775 generic.go:334] "Generic (PLEG): container finished" podID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" exitCode=143
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405759 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405844 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"7a575331b653caf550f94723cfa812dc6bcd3d3fd0557ee1db29564218b576be"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405863 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerDied","Data":"3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410736 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410821 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419510 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419538 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419553 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419563 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.426339 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" exitCode=143
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.426520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.446532 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.449493 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"98a47029353e8ac81c34e8a77e13a6ae144436ae57c8cc4cc8ecca40c93dad8a"}
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.453480 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.508251 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.518771 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.520147 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520202 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} err="failed to get container status \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520236 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.520579 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520611 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} err="failed to get container status \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520629 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520986 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} err="failed to get container status \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521007 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521186 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} err="failed to get container status \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521204 4775 scope.go:117] "RemoveContainer" containerID="a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.537577 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.557629 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558142 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558168 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558208 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="init"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558216 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="init"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558235 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558243 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558257 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558264 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558279 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558289 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata"
Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558301 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558308 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560116 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560159 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560173 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560180 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.565702 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.570390 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.571869 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.575930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.576326 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.576731 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.577618 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.579038 4775 scope.go:117] "RemoveContainer" containerID="88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.595527 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.606948 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.618668 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"]
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624593 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624637 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624770 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.625020 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726630 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726707 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726894 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.727414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.730701 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.731293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.732099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.739534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.740022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.742319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.743233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.760583 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" path="/var/lib/kubelet/pods/6a0f23e6-5732-4337-b5fa-d433e99f5cb1/volumes"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.761336 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" path="/var/lib/kubelet/pods/91668934-529e-4df9-b41f-8cd54e5920ea/volumes"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.900739 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.910829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.420773 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:10 crc kubenswrapper[4775]: W0127 11:40:10.423878 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7916937d_e997_4d88_8a6b_9fecf57f6828.slice/crio-659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab WatchSource:0}: Error finding container 659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab: Status 404 returned error can't find the container with id 659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab
Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.455412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"}
Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.458533 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" containerID="cri-o://c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" gracePeriod=30
Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.486031 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 11:40:10 crc kubenswrapper[4775]: W0127 11:40:10.493978 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2945fbf_3178_420a_bfaf_d0d9c91d610a.slice/crio-f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f WatchSource:0}: Error finding container f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f: Status 404 returned error can't find the container with id f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.467750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f"}
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.467791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d"}
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.474376 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerStarted","Data":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"}
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.474467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerStarted","Data":"f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f"}
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.476234 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.500069 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.500052801 podStartE2EDuration="2.500052801s" podCreationTimestamp="2026-01-27 11:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:11.497360806 +0000 UTC m=+1190.638958583" watchObservedRunningTime="2026-01-27 11:40:11.500052801 +0000 UTC m=+1190.641650578"
Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.527633 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.5276122130000003 podStartE2EDuration="2.527612213s" podCreationTimestamp="2026-01-27 11:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:11.523836409 +0000 UTC m=+1190.665434206" watchObservedRunningTime="2026-01-27 11:40:11.527612213 +0000 UTC m=+1190.669209980"
Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.947298 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.948715 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.955935 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.955976 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register
an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490357 4775 generic.go:334] "Generic (PLEG): container finished" podID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" exitCode=0 Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerDied","Data":"c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d"} Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerDied","Data":"6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516"} Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490758 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.525072 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610530 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.616604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj" (OuterVolumeSpecName: "kube-api-access-dc8nj") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "kube-api-access-dc8nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.635760 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data" (OuterVolumeSpecName: "config-data") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.637637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712371 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712404 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712415 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.464788 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.500602 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" exitCode=0 Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.500694 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501646 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"} Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501963 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"234a0c458af079345c8244f18b5988ff4056e6dc102c0dcac07740fc8e5eeb55"} Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501984 4775 scope.go:117] "RemoveContainer" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.528951 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.529107 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs" (OuterVolumeSpecName: "logs") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.531089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.533315 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.546730 4775 scope.go:117] "RemoveContainer" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.573855 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb" (OuterVolumeSpecName: "kube-api-access-8zjnb") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "kube-api-access-8zjnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.573996 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data" (OuterVolumeSpecName: "config-data") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.574037 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.576141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.582869 4775 scope.go:117] "RemoveContainer" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.583643 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": container with ID starting with a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d not found: ID does not exist" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583685 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"} err="failed to get container status \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": rpc error: code = NotFound desc = could not find container \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": container with ID starting with a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d not found: ID does not exist" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583705 4775 scope.go:117] "RemoveContainer" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.583949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": container with ID starting with 12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8 not found: ID does not exist" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583965 
4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"} err="failed to get container status \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": rpc error: code = NotFound desc = could not find container \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": container with ID starting with 12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8 not found: ID does not exist" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.601814 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602194 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602213 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602226 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602232 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602250 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602256 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602410 4775 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602424 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602438 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.603064 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.605375 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.613947 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635121 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635164 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635178 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.838615 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.839958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.840030 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.840083 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.844190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.847181 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.854410 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.862161 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.866862 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.868525 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.870730 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.883849 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.912307 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.912609 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942113 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942583 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.970580 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050784 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.051976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: 
\"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.056403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.064303 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.075954 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.190693 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.383179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:15 crc kubenswrapper[4775]: W0127 11:40:15.399016 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76fecdf_e253_454b_8e4e_4c9109834188.slice/crio-c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb WatchSource:0}: Error finding container c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb: Status 404 returned error can't find the container with id c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.513948 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerStarted","Data":"c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb"} Jan 27 11:40:15 crc kubenswrapper[4775]: W0127 11:40:15.606649 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3bcd5d9_85e4_4754_b39f_17ee05c9991e.slice/crio-e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0 WatchSource:0}: Error finding container e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0: Status 404 returned error can't find the container with id e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0 Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.606974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.756396 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" path="/var/lib/kubelet/pods/d0c2fced-9c0a-4cef-90ed-d6429ee82751/volumes" Jan 27 
11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.757105 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" path="/var/lib/kubelet/pods/d8f6cebd-0ba7-4713-906a-f48b094c332b/volumes"
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.523913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerStarted","Data":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"}
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"}
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526298 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"}
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526312 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0"}
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.551963 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.551945742 podStartE2EDuration="2.551945742s" podCreationTimestamp="2026-01-27 11:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:16.540531547 +0000 UTC m=+1195.682129324" watchObservedRunningTime="2026-01-27 11:40:16.551945742 +0000 UTC m=+1195.693543519"
Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.565564 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.565544998 podStartE2EDuration="2.565544998s" podCreationTimestamp="2026-01-27 11:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:16.559188073 +0000 UTC m=+1195.700785850" watchObservedRunningTime="2026-01-27 11:40:16.565544998 +0000 UTC m=+1195.707142775"
Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.911995 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.912377 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.935956 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.970719 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.511335 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.923598 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.923626 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:40:24 crc kubenswrapper[4775]: I0127 11:40:24.971369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.017701 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.111163 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.111697 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics" containerID="cri-o://41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" gracePeriod=30
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.193088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.193159 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.593952 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.634992 4775 generic.go:334] "Generic (PLEG): container finished" podID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" exitCode=2
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.635895 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerDied","Data":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"}
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerDied","Data":"8c517699b915acc52e0019dc1c45d2e9a3ea6904e06f7498f332512ca9be5304"}
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636367 4775 scope.go:117] "RemoveContainer" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.657891 4775 scope.go:117] "RemoveContainer" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"
Jan 27 11:40:25 crc kubenswrapper[4775]: E0127 11:40:25.658271 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": container with ID starting with 41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888 not found: ID does not exist" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.658311 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"} err="failed to get container status \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": rpc error: code = NotFound desc = could not find container \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": container with ID starting with 41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888 not found: ID does not exist"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.659795 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"d650e06f-8d9a-443d-9045-82cef3c36ad3\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") "
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.668141 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.668167 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx" (OuterVolumeSpecName: "kube-api-access-zzdzx") pod "d650e06f-8d9a-443d-9045-82cef3c36ad3" (UID: "d650e06f-8d9a-443d-9045-82cef3c36ad3"). InnerVolumeSpecName "kube-api-access-zzdzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.762223 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.964147 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.974717 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993354 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:25 crc kubenswrapper[4775]: E0127 11:40:25.993755 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993772 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993937 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics"
Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.994521 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.000535 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.001123 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.016831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069657 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.175512 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.177010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.185412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.192338 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.276615 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.276622 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.312887 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.823371 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.187188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193353 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent" containerID="cri-o://3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" gracePeriod=30
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193833 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd" containerID="cri-o://eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" gracePeriod=30
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193907 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core" containerID="cri-o://8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" gracePeriod=30
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193946 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent" containerID="cri-o://43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" gracePeriod=30
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658077 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" exitCode=0
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658405 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" exitCode=2
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658416 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" exitCode=0
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"}
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"}
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"}
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7aa68248-0707-4f5c-8689-57cf6d07c250","Type":"ContainerStarted","Data":"00a80a967ddb6eab2e4de0d664cc76e1e667eaa84ad42ded92f09ec9ae23383c"}
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7aa68248-0707-4f5c-8689-57cf6d07c250","Type":"ContainerStarted","Data":"4e5b8f5a86f4b65ac4cff674b3b701010564e4e4d052eb787ecc2de495fba9f1"}
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.674896 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.325264385 podStartE2EDuration="2.674879045s" podCreationTimestamp="2026-01-27 11:40:25 +0000 UTC" firstStartedPulling="2026-01-27 11:40:26.827516833 +0000 UTC m=+1205.969114600" lastFinishedPulling="2026-01-27 11:40:27.177131483 +0000 UTC m=+1206.318729260" observedRunningTime="2026-01-27 11:40:27.672795637 +0000 UTC m=+1206.814393434" watchObservedRunningTime="2026-01-27 11:40:27.674879045 +0000 UTC m=+1206.816476822"
Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.755576 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" path="/var/lib/kubelet/pods/d650e06f-8d9a-443d-9045-82cef3c36ad3/volumes"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.304353 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434671 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434937 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434979 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") "
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435467 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.440049 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv" (OuterVolumeSpecName: "kube-api-access-44vhv") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "kube-api-access-44vhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.446569 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts" (OuterVolumeSpecName: "scripts") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.471253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.505591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537434 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537480 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537507 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537516 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537524 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537532 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.543663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data" (OuterVolumeSpecName: "config-data") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.638927 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684443 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" exitCode=0
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"}
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684530 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229"}
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684548 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684565 4775 scope.go:117] "RemoveContainer" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.711799 4775 scope.go:117] "RemoveContainer" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.721505 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.730214 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.741034 4775 scope.go:117] "RemoveContainer" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.756674 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877bcef1-579c-413c-a0c0-6dad63885091" path="/var/lib/kubelet/pods/877bcef1-579c-413c-a0c0-6dad63885091/volumes"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757441 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757818 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757837 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757845 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757857 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757892 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757901 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758082 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758109 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758121 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758130 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.766978 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.768682 4775 scope.go:117] "RemoveContainer" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.770815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.770965 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.780975 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.796507 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.806915 4775 scope.go:117] "RemoveContainer" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.807421 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": container with ID starting with eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261 not found: ID does not exist" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.807479 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"} err="failed to get container status \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": rpc error: code = NotFound desc = could not find container \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": container with ID starting with eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261 not found: ID does not exist"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.807508 4775 scope.go:117] "RemoveContainer" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.810887 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": container with ID starting with 8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d not found: ID does not exist" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.810943 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"} err="failed to get container status \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": rpc error: code = NotFound desc = could not find container \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": container with ID starting with 8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d not found: ID does not exist"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.810972 4775 scope.go:117] "RemoveContainer" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.811265 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": container with ID starting with 43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005 not found: ID does not exist" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811295 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"} err="failed to get container status \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": rpc error: code = NotFound desc = could not find container \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": container with ID starting with 43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005 not found: ID does not exist"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811315 4775 scope.go:117] "RemoveContainer" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"
Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.811650 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": container with ID starting with 3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2 not found: ID does not exist" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811672 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"} err="failed to get container status \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": rpc error: code = NotFound desc = could not find container \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": container with ID starting with 3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2 not found: ID does not exist"
Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842047 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842125 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842147 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: 
\"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.917104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.919089 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.924392 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: 
\"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943701 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc 
kubenswrapper[4775]: I0127 11:40:29.943949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.944071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.944402 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.947902 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.948330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.948569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.949392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.950048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.967390 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.093653 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:30 crc kubenswrapper[4775]: W0127 11:40:30.597012 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e2cdf2_0f89_471a_b0b4_a6437dfc7428.slice/crio-2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b WatchSource:0}: Error finding container 2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b: Status 404 returned error can't find the container with id 2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.604988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.693721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b"} Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.701550 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:31.703816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712596 4775 generic.go:334] "Generic (PLEG): container finished" podID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerID="5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" exitCode=137 Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712681 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerDied","Data":"5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712857 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerDied","Data":"ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712870 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.760067 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908225 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908352 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.914337 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc" (OuterVolumeSpecName: "kube-api-access-nmdhc") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "kube-api-access-nmdhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.935165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.936321 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data" (OuterVolumeSpecName: "config-data") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011899 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011959 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011982 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.725560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.725609 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.770398 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.785488 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.796414 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: E0127 11:40:33.797861 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.797897 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.798136 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.798881 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.807498 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.809666 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.810080 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.810505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc 
kubenswrapper[4775]: I0127 11:40:33.928329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928420 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029534 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 
11:40:34.029728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029768 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.034414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.036015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.039888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.044918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.045392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.134036 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.588956 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:34 crc kubenswrapper[4775]: W0127 11:40:34.594758 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6628a06a_9e13_4402_94d9_1df5c42e3c7a.slice/crio-fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2 WatchSource:0}: Error finding container fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2: Status 404 returned error can't find the container with id fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2 Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.735402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerStarted","Data":"fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2"} Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.194693 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:40:35 crc 
kubenswrapper[4775]: I0127 11:40:35.195208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.199556 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.202069 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.754375 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" path="/var/lib/kubelet/pods/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8/volumes"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerStarted","Data":"6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592"}
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755094 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755148 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"}
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.776336 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.776312737 podStartE2EDuration="2.776312737s" podCreationTimestamp="2026-01-27 11:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:35.766212148 +0000 UTC m=+1214.907809915" watchObservedRunningTime="2026-01-27 11:40:35.776312737 +0000 UTC m=+1214.917910514"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.943191 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"]
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.945310 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.979083 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"]
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068856 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068925 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170138 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171844 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.172027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.191063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.311405 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.360797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.821083 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"]
Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.770843 4775 generic.go:334] "Generic (PLEG): container finished" podID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerID="86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a" exitCode=0
Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.773614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a"}
Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.773660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerStarted","Data":"a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692"}
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.312029 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.417944 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.780646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerStarted","Data":"cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c"}
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.781803 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-dvccn"
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.791537 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" containerID="cri-o://e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" gracePeriod=30
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"}
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792310 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792358 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" containerID="cri-o://3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" gracePeriod=30
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.814845 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" podStartSLOduration=3.814831951 podStartE2EDuration="3.814831951s" podCreationTimestamp="2026-01-27 11:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:38.802637923 +0000 UTC m=+1217.944235700" watchObservedRunningTime="2026-01-27 11:40:38.814831951 +0000 UTC m=+1217.956429728"
Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.835008 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.811908766 podStartE2EDuration="9.834987778s" podCreationTimestamp="2026-01-27 11:40:29 +0000 UTC" firstStartedPulling="2026-01-27 11:40:30.599363722 +0000 UTC m=+1209.740961499" lastFinishedPulling="2026-01-27 11:40:37.622442734 +0000 UTC m=+1216.764040511" observedRunningTime="2026-01-27 11:40:38.832080078 +0000 UTC m=+1217.973677855" watchObservedRunningTime="2026-01-27 11:40:38.834987778 +0000 UTC m=+1217.976585555"
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.134152 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.801364 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" exitCode=143
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.801460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"}
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802016 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" containerID="cri-o://092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" gracePeriod=30
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802039 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" containerID="cri-o://d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" gracePeriod=30
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802107 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" containerID="cri-o://4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" gracePeriod=30
Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802138 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" containerID="cri-o://3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" gracePeriod=30
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.619913 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765913 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766058 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766233 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766270 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") "
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.767072 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.767093 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.771910 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs" (OuterVolumeSpecName: "kube-api-access-dnhfs") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "kube-api-access-dnhfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.772218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts" (OuterVolumeSpecName: "scripts") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.800293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823117 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" exitCode=0
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823158 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" exitCode=2
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823169 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" exitCode=0
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823177 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" exitCode=0
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.824488 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825035 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"}
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825072 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"}
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"}
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"}
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b"}
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825128 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.828661 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.843402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.850627 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875520 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875552 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875562 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875573 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875582 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.877750 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.885107 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data" (OuterVolumeSpecName: "config-data") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.900814 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.918724 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"
Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919131 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919183 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919203 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"
Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919429 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919482 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919496 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"
Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919680 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919699 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919712 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"
Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919840 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919854 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919866 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919998 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920012 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920189 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920204 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920358 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920377 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920823 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920856 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.921106 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.921127 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922160 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922193 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922432 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922492 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"
Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container
with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922719 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923006 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923031 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923236 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923255 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923413 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923433 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.924338 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.977992 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.160533 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.171775 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.187529 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188248 
4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188373 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188479 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188552 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188653 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188754 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188831 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188902 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189176 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189267 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: 
I0127 11:40:41.189354 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189428 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.191694 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194333 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194836 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.196909 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288589 
4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288620 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288661 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288682 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390426 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390939 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 
11:40:41.390976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.391034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.391628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.392302 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.392323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.400414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: 
\"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.400825 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.401345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.406875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.407596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.414837 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.512657 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.763305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" path="/var/lib/kubelet/pods/18e2cdf2-0f89-471a-b0b4-a6437dfc7428/volumes" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.960661 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: W0127 11:40:41.966876 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fb6dfd_0694_418a_965e_789707762ef7.slice/crio-00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e WatchSource:0}: Error finding container 00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e: Status 404 returned error can't find the container with id 00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.969437 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.425250 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.516111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.517259 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs" (OuterVolumeSpecName: "logs") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.523581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q" (OuterVolumeSpecName: "kube-api-access-zzc4q") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "kube-api-access-zzc4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.562817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.568073 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data" (OuterVolumeSpecName: "config-data") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618367 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618435 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618478 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618491 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.843415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.843471 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845153 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" exitCode=0 Jan 27 11:40:42 crc 
kubenswrapper[4775]: I0127 11:40:42.845184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845203 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845213 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845223 4775 scope.go:117] "RemoveContainer" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.869706 4775 scope.go:117] "RemoveContainer" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.883082 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.899256 4775 scope.go:117] "RemoveContainer" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.900409 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": container with ID starting with 3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a not found: ID does not exist" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900443 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"} err="failed to get container status \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": rpc error: code = NotFound desc = could not find container \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": container with ID starting with 3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a not found: ID does not exist" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900496 4775 scope.go:117] "RemoveContainer" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.900778 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": container with ID starting with e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07 not found: ID does not exist" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900816 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"} err="failed to get container status \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": rpc error: code = NotFound desc = could not find container \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": container with ID starting with e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07 not found: ID does not exist" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.907077 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.920849 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.921243 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921260 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.921273 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921280 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921479 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921496 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.922373 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928067 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928282 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928429 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.933061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127426 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 
11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127543 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.128748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.135926 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.136021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.139237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.140079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.145190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.258653 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.789769 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" path="/var/lib/kubelet/pods/d3bcd5d9-85e4-4754-b39f-17ee05c9991e/volumes" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.795392 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.854465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"e9b280e81c9b3e2f27e2a003b985657f4d413b9e18eca2cc53bed8cbd3cdcb27"} Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.856248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.134734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.150482 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.868618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"3ce959ac65992ddc8dbe0e5dc438cf5975b410ccc59cc1574279bcf41dba9159"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.870966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} Jan 
27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.871019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.901381 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.901362281 podStartE2EDuration="2.901362281s" podCreationTimestamp="2026-01-27 11:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:44.892689531 +0000 UTC m=+1224.034287318" watchObservedRunningTime="2026-01-27 11:40:44.901362281 +0000 UTC m=+1224.042960058" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.934623 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.133679 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.134712 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.137052 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.137075 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.147605 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263782 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366179 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.372097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.376155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.385715 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.387131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.449988 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.881388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"6a28a6bfae3dfae0c75190ed63d0170da7178a0b262c8aee08135af71c93d6d7"} Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.906800 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5055357520000001 podStartE2EDuration="4.906768279s" podCreationTimestamp="2026-01-27 11:40:41 +0000 UTC" firstStartedPulling="2026-01-27 11:40:41.969192342 +0000 UTC m=+1221.110790119" lastFinishedPulling="2026-01-27 11:40:45.370424869 +0000 UTC m=+1224.512022646" observedRunningTime="2026-01-27 11:40:45.906522283 +0000 UTC m=+1225.048120070" watchObservedRunningTime="2026-01-27 11:40:45.906768279 +0000 UTC m=+1225.048366056" Jan 27 11:40:45 crc kubenswrapper[4775]: W0127 11:40:45.955581 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb77cbe7c_5901_44d2_959f_5435b8adbc85.slice/crio-a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3 WatchSource:0}: Error finding container a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3: Status 404 returned error can't find the container with id a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3 Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.956622 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.313760 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.391770 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.392110 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns" containerID="cri-o://ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" gracePeriod=10 Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905114 4775 generic.go:334] "Generic (PLEG): container finished" podID="e869df3d-c15b-4610-bb78-00ad49940d17" containerID="ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" exitCode=0 Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905517 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905528 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.908468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerStarted","Data":"fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.908520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" 
event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerStarted","Data":"a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.908766 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.925761 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4lnkz" podStartSLOduration=1.9257450839999999 podStartE2EDuration="1.925745084s" podCreationTimestamp="2026-01-27 11:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:46.924956562 +0000 UTC m=+1226.066554339" watchObservedRunningTime="2026-01-27 11:40:46.925745084 +0000 UTC m=+1226.067342861" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.928329 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024770 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024850 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024884 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024945 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.025021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.025116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.036962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444" (OuterVolumeSpecName: "kube-api-access-qw444") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "kube-api-access-qw444". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.079105 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.083619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.084703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.102065 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config" (OuterVolumeSpecName: "config") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127035 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127073 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127083 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127094 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127104 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.130352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.228809 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.915413 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.935138 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.946183 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:49 crc kubenswrapper[4775]: I0127 11:40:49.758376 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" path="/var/lib/kubelet/pods/e869df3d-c15b-4610-bb78-00ad49940d17/volumes" Jan 27 11:40:50 crc kubenswrapper[4775]: I0127 11:40:50.944068 4775 generic.go:334] "Generic (PLEG): container finished" podID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerID="fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab" exitCode=0 Jan 27 11:40:50 crc kubenswrapper[4775]: I0127 11:40:50.944184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerDied","Data":"fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab"} Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.496548 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572024 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572719 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.577702 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l" (OuterVolumeSpecName: "kube-api-access-b8f8l") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "kube-api-access-b8f8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.588635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts" (OuterVolumeSpecName: "scripts") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.605018 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.617484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data" (OuterVolumeSpecName: "config-data") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673826 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673859 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673871 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673880 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968508 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerDied","Data":"a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3"} Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968542 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968591 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185313 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185864 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log" containerID="cri-o://cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185995 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api" containerID="cri-o://bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.207088 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.207616 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler" containerID="cri-o://5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219496 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219725 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" containerID="cri-o://8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219844 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" containerID="cri-o://b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.810426 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977006 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80" exitCode=0
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977035 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588" exitCode=143
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977036 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"e9b280e81c9b3e2f27e2a003b985657f4d413b9e18eca2cc53bed8cbd3cdcb27"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977116 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.984299 4775 generic.go:334] "Generic (PLEG): container finished" podID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerID="8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" exitCode=143
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.984329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.997288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:53.999994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000140 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000252 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs" (OuterVolumeSpecName: "logs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.001392 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.005703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf" (OuterVolumeSpecName: "kube-api-access-z9zsf") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "kube-api-access-z9zsf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.009674 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.030419 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.032418 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.032474 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} err="failed to get container status \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.032497 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.033250 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.033277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data" (OuterVolumeSpecName: "config-data") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.034707 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.034748 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} err="failed to get container status \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.034765 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035078 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} err="failed to get container status \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035096 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035487 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} err="failed to get container status \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.061920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.061946 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102580 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102626 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102635 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102644 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102653 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.332832 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.355963 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375210 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375644 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="init"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375662 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="init"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375676 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375684 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375710 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375717 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375735 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375740 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375756 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375905 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375918 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375932 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375949 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.376971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.384418 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.387962 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.388225 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.389232 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407375 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.509303 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510510 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510547 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.511716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.513987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.515587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.516490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.517008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.538147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.711353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.973681 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.975193 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.976580 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.976612 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler"
Jan 27 11:40:55 crc kubenswrapper[4775]: I0127 11:40:55.145525 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:55 crc kubenswrapper[4775]: W0127 11:40:55.155138 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2 WatchSource:0}: Error finding container e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2: Status 404 returned error can't find the container with id e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2
Jan 27 11:40:55 crc kubenswrapper[4775]: I0127 11:40:55.755112 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" path="/var/lib/kubelet/pods/f4deda10-439e-4a94-b215-968b1f49a1f7/volumes"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001896 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.018639 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.018621641 podStartE2EDuration="2.018621641s" podCreationTimestamp="2026-01-27 11:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:56.016769851 +0000 UTC m=+1235.158367628" watchObservedRunningTime="2026-01-27 11:40:56.018621641 +0000 UTC m=+1235.160219418"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.670081 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:55464->10.217.0.209:8775: read: connection reset by peer"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.670357 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:55474->10.217.0.209:8775: read: connection reset by peer"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.011377 4775 generic.go:334] "Generic (PLEG): container finished" podID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerID="b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" exitCode=0
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012266 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f"}
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"}
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012345 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.092298 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.159988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160076 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160770 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs" (OuterVolumeSpecName: "logs") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.168672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn" (OuterVolumeSpecName: "kube-api-access-7swxn") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "kube-api-access-7swxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.205956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.214007 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data" (OuterVolumeSpecName: "config-data") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.253226 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262590 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262619 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262629 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262636 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262644 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.034190 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.063584 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.073019 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088385 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: E0127 11:40:58.088879 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088906 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: E0127 11:40:58.088939 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088949 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.089155 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.089189 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.090356 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.093249 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.094538 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.111258 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.178953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179209 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod
\"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280884 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280991 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 
11:40:58.281072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.281715 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.285135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.285784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.286196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.305119 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: 
\"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.419794 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.851560 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.994479 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046043 4775 generic.go:334] "Generic (PLEG): container finished" podID="b76fecdf-e253-454b-8e4e-4c9109834188" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" exitCode=0 Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046101 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerDied","Data":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"} Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerDied","Data":"c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb"} Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.047042 4775 scope.go:117] "RemoveContainer" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.047856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"cc025fb49ac4acec92f7c01cfadfc52510ce9eff9a04601bf2e7f2c28847302c"} Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.077981 4775 scope.go:117] "RemoveContainer" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" Jan 27 11:40:59 crc kubenswrapper[4775]: E0127 11:40:59.078780 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": container with ID starting with 5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e not found: ID does not exist" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.078895 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"} err="failed to get container status \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": rpc error: code = NotFound desc = could not find container \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": container with ID starting with 5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e not found: ID does not exist" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.095986 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.096229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") 
pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.096418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.102413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm" (OuterVolumeSpecName: "kube-api-access-9mmbm") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "kube-api-access-9mmbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.130229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data" (OuterVolumeSpecName: "config-data") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.131852 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199051 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199090 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199101 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.378676 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.389980 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.401435 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:59 crc kubenswrapper[4775]: E0127 11:40:59.401872 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.401890 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.402077 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 
11:40:59.402674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.404121 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.412723 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.606572 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc 
kubenswrapper[4775]: I0127 11:40:59.606648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.606679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.611748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.612594 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.623056 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.724104 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.772600 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" path="/var/lib/kubelet/pods/7916937d-e997-4d88-8a6b-9fecf57f6828/volumes" Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.773339 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" path="/var/lib/kubelet/pods/b76fecdf-e253-454b-8e4e-4c9109834188/volumes" Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.059372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"} Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.059424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"} Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.083248 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.083229336 podStartE2EDuration="2.083229336s" podCreationTimestamp="2026-01-27 11:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:00.081022917 +0000 UTC m=+1239.222620704" watchObservedRunningTime="2026-01-27 11:41:00.083229336 +0000 UTC m=+1239.224827103" Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.183382 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:00 crc kubenswrapper[4775]: W0127 11:41:00.185413 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7600201f_fb6c_4eb7_8b0a_19078b93c131.slice/crio-2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec WatchSource:0}: Error finding container 2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec: Status 404 returned error can't find the container with id 2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.070271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerStarted","Data":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"} Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.070619 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerStarted","Data":"2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec"} Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.112303 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.112283286 podStartE2EDuration="2.112283286s" podCreationTimestamp="2026-01-27 11:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:01.096248637 +0000 UTC m=+1240.237846414" watchObservedRunningTime="2026-01-27 11:41:01.112283286 +0000 UTC m=+1240.253881063" Jan 27 11:41:03 crc kubenswrapper[4775]: I0127 11:41:03.420264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:03 crc kubenswrapper[4775]: I0127 11:41:03.420496 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.712332 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.712695 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.725137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:41:05 crc kubenswrapper[4775]: I0127 11:41:05.724611 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:05 crc kubenswrapper[4775]: I0127 11:41:05.724611 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:08 crc kubenswrapper[4775]: I0127 11:41:08.421092 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:08 crc kubenswrapper[4775]: I0127 11:41:08.422574 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.435628 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.435923 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.725082 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.772044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:41:10 crc kubenswrapper[4775]: I0127 11:41:10.176039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:41:11 crc kubenswrapper[4775]: I0127 11:41:11.525500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.717744 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.719216 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.722765 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.726549 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:41:15 crc kubenswrapper[4775]: I0127 11:41:15.212066 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:41:15 crc kubenswrapper[4775]: I0127 11:41:15.220347 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.426107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.427618 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.430788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:41:19 crc kubenswrapper[4775]: I0127 11:41:19.265019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.601272 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.601827 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" gracePeriod=30 Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.640717 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.640995 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" gracePeriod=30 Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.650990 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.651196 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" 
containerName="nova-scheduler-scheduler" containerID="cri-o://a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" gracePeriod=30 Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.658581 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750461 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750876 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" containerID="cri-o://680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" gracePeriod=30 Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750957 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" containerID="cri-o://3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" gracePeriod=30 Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.285940 4775 generic.go:334] "Generic (PLEG): container finished" podID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerID="680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" exitCode=143 Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.286077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f"} Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.289714 4775 generic.go:334] "Generic (PLEG): container finished" podID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerID="6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" exitCode=0 Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.289759 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerDied","Data":"6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592"} Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.396433 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.402210 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.403987 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.404021 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.430133 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618131 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618212 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618317 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.631282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d" (OuterVolumeSpecName: "kube-api-access-7cv7d") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "kube-api-access-7cv7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.647554 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data" (OuterVolumeSpecName: "config-data") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.658909 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.691052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.706791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728575 4775 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728618 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728630 4775 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728651 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728660 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.829278 4775 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931377 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931446 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931537 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.935767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s" (OuterVolumeSpecName: "kube-api-access-5zw2s") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "kube-api-access-5zw2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.966181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data" (OuterVolumeSpecName: "config-data") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.966782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036009 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036104 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036129 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerDied","Data":"fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304082 4775 scope.go:117] "RemoveContainer" containerID="6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304219 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308227 4775 generic.go:334] "Generic (PLEG): container finished" podID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" exitCode=0 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerDied","Data":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308303 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerDied","Data":"2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308442 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" containerID="cri-o://c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" gracePeriod=30 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.309487 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" containerID="cri-o://75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" gracePeriod=30 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.345616 4775 scope.go:117] "RemoveContainer" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 
11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.367028 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.405745 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.406542 4775 scope.go:117] "RemoveContainer" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.417175 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": container with ID starting with a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512 not found: ID does not exist" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.417529 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"} err="failed to get container status \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": rpc error: code = NotFound desc = could not find container \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": container with ID starting with a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512 not found: ID does not exist" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.426505 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.441400 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.460471 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.461102 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461124 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.461150 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461159 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461377 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461410 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.462183 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.464715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.465409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.465833 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.478357 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.479801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.483647 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.491725 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.515199 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lv6\" 
(UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751506 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751594 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lv6\" (UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751879 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.757206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.758869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.760013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.760747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.765020 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.765939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.768102 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lv6\" (UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.771076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.791953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.797432 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.260358 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.274968 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.319433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerStarted","Data":"448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.325061 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" exitCode=143 Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.325149 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.326220 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"80ce7ac7-056a-44ec-be77-f87a96dc23f5","Type":"ContainerStarted","Data":"0c168ba474b47e1c1b0eb32b31e2979ca59877a54a0bd3aeb7d60c7138384b25"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.754804 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" path="/var/lib/kubelet/pods/6628a06a-9e13-4402-94d9-1df5c42e3c7a/volumes" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.755659 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" path="/var/lib/kubelet/pods/7600201f-fb6c-4eb7-8b0a-19078b93c131/volumes" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.965309 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: W0127 11:41:23.974809 4775 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2": error while statting cgroup v2: [read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2/pids.current: no such device], continuing to push stats Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.063440 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.063661 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" gracePeriod=30 Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.238133 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2\": RecentStats: unable to find data in memory cache]" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.344331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"80ce7ac7-056a-44ec-be77-f87a96dc23f5","Type":"ContainerStarted","Data":"0d6942f38537a5c467895adb33195c2af219a87282909e2d50ea799fbfcfabbb"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.346633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerStarted","Data":"876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348555 4775 generic.go:334] "Generic (PLEG): container finished" podID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerID="3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" exitCode=0 Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348702 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.362821 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.362804146 podStartE2EDuration="2.362804146s" podCreationTimestamp="2026-01-27 11:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:24.357495401 +0000 UTC m=+1263.499093188" watchObservedRunningTime="2026-01-27 11:41:24.362804146 +0000 UTC m=+1263.504401923" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.365247 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.384372 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.384351076 podStartE2EDuration="2.384351076s" podCreationTimestamp="2026-01-27 11:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:24.376662016 +0000 UTC m=+1263.518259813" watchObservedRunningTime="2026-01-27 11:41:24.384351076 +0000 UTC m=+1263.525948863" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487535 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487581 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487656 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: 
\"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487804 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.490413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs" (OuterVolumeSpecName: "logs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.496719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2" (OuterVolumeSpecName: "kube-api-access-48lz2") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "kube-api-access-48lz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.524686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data" (OuterVolumeSpecName: "config-data") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.527900 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.588401 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591593 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591624 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591634 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591644 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591652 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.614412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.693285 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.903878 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.905175 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.906241 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.906285 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.360413 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362306 4775 generic.go:334] "Generic (PLEG): container finished" podID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" exitCode=0 Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362423 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerDied","Data":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"} Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362552 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" containerID="cri-o://876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" gracePeriod=30 Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362615 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerDied","Data":"7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8"} Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362641 4775 scope.go:117] "RemoveContainer" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.397624 4775 scope.go:117] "RemoveContainer" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.399836 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": container with ID starting with 7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7 not found: ID does not exist" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.399884 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"} err="failed to get container status \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": rpc error: code = NotFound desc = could not find container \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": container with ID starting with 7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7 not found: ID does not exist" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.440835 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.453121 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463029 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463568 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463593 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463626 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463635 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463648 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463656 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463879 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463927 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.465115 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468352 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468606 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.473947 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507631 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507788 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.520308 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd" (OuterVolumeSpecName: "kube-api-access-cz4nd") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "kube-api-access-cz4nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.533715 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.537704 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data" (OuterVolumeSpecName: "config-data") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610593 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610823 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611003 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611023 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611036 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712430 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.713410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.716667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.716699 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.717778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.717924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.734477 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.756206 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" path="/var/lib/kubelet/pods/724fa5b2-f306-42e9-8781-76a9166bd19e/volumes" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.760662 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:54928->10.217.0.221:8775: read: connection reset by peer" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 
11:41:25.760713 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:54930->10.217.0.221:8775: read: connection reset by peer" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.789905 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.197136 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.289811 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323479 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323630 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323735 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.326795 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs" (OuterVolumeSpecName: "logs") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.330306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7" (OuterVolumeSpecName: "kube-api-access-tfsp7") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "kube-api-access-tfsp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.357581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data" (OuterVolumeSpecName: "config-data") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.366566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.372738 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375309 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" exitCode=0 Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375390 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"cc025fb49ac4acec92f7c01cfadfc52510ce9eff9a04601bf2e7f2c28847302c"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375392 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375408 4775 scope.go:117] "RemoveContainer" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.378823 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"a20713d4384bfaba787f9cde797a335807fd0a15bcdeb3c72de591e49ae0c218"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.386435 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.417105 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.423140 4775 scope.go:117] "RemoveContainer" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425479 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425507 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425518 4775 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425527 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425540 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.432596 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.441827 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.442240 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442258 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.442291 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442298 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442462 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" 
containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442479 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.443173 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.448306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.453479 4775 scope.go:117] "RemoveContainer" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.454532 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": container with ID starting with 75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712 not found: ID does not exist" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.454585 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"} err="failed to get container status \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": rpc error: code = NotFound desc = could not find container \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": container with ID starting with 75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712 not found: ID does not exist" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.454614 4775 scope.go:117] "RemoveContainer" 
containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.454986 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": container with ID starting with c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81 not found: ID does not exist" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.455016 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"} err="failed to get container status \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": rpc error: code = NotFound desc = could not find container \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": container with ID starting with c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81 not found: ID does not exist" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.455302 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod 
\"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.706023 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.715231 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730471 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc 
kubenswrapper[4775]: I0127 11:41:26.730827 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.735849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.736265 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.744901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.745020 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.756997 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.757326 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.761874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.774018 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832715 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832791 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934395 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934708 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc 
kubenswrapper[4775]: I0127 11:41:26.934809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934835 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.935881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.939798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.939809 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.940027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.955061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.075632 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.232558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.312808 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:27 crc kubenswrapper[4775]: W0127 11:41:27.316528 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb95ff32a_7b7f_43d8_b521_6d07c8d78c99.slice/crio-20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9 WatchSource:0}: Error finding container 20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9: Status 404 returned error can't find the container with id 20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9 Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.389426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.392088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"21548904-8b74-4b9b-81fb-df04e62dc7df","Type":"ContainerStarted","Data":"83f9f6683189e5a3d7fdfa25e43bb3cf9538df89446775cf2db0019019ccadca"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.398026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"1faaf895137c94f1f4724ab67c10eea2055f0e7b9a65e0befb41b89a169cfcde"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.398066 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"bd11b7d1772192a1c787c50ef54d4452b93c364617b584dc04bf19de103c2092"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.420857 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.420840827 podStartE2EDuration="2.420840827s" podCreationTimestamp="2026-01-27 11:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:27.418629326 +0000 UTC m=+1266.560227103" watchObservedRunningTime="2026-01-27 11:41:27.420840827 +0000 UTC m=+1266.562438604" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.758315 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" path="/var/lib/kubelet/pods/3d743fc7-b5d1-4890-bc22-22de8227323e/volumes" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.759581 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" path="/var/lib/kubelet/pods/7617063e-fa32-45fc-b06e-7ecff629f7db/volumes" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.792879 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.798550 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.408303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"21548904-8b74-4b9b-81fb-df04e62dc7df","Type":"ContainerStarted","Data":"495e47ab9dbb841fe45b54288e3a3a9b08b1650f9196643c2c010473caf3db1f"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.408426 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.410621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"721af181f4ec9c0e5860b71c7f952716e6c800979483f969a6c73597a138efac"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.410702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"1dd5f8c2cba5b6c42ddbf68ee4c0d313f4a062d523c00451a1df61b5ad197c22"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.449405 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.449385483 podStartE2EDuration="2.449385483s" podCreationTimestamp="2026-01-27 11:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:28.427486023 +0000 UTC m=+1267.569083810" watchObservedRunningTime="2026-01-27 11:41:28.449385483 +0000 UTC m=+1267.590983260" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.479628 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.479607169 podStartE2EDuration="2.479607169s" podCreationTimestamp="2026-01-27 11:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:28.46499199 +0000 UTC m=+1267.606589767" watchObservedRunningTime="2026-01-27 11:41:28.479607169 +0000 UTC m=+1267.621204946" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.487828 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.042801 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.180780 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.181246 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.181379 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.200654 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw" (OuterVolumeSpecName: "kube-api-access-fb2kw") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "kube-api-access-fb2kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.215190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.222060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data" (OuterVolumeSpecName: "config-data") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283779 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283828 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283837 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.419894 4775 generic.go:334] "Generic (PLEG): container finished" podID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" exitCode=0 Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.419956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerDied","Data":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"} Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerDied","Data":"f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f"} Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420038 4775 scope.go:117] "RemoveContainer" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420254 4775 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.442341 4775 scope.go:117] "RemoveContainer" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: E0127 11:41:29.445010 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": container with ID starting with 178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a not found: ID does not exist" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.445053 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"} err="failed to get container status \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": rpc error: code = NotFound desc = could not find container \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": container with ID starting with 178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a not found: ID does not exist" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.455092 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.463249 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.476516 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: E0127 11:41:29.476958 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" 
containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.476980 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.477336 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.478023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.483273 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.490313 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.587631 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.587882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.588393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.693536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.693658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.710836 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.756563 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" path="/var/lib/kubelet/pods/f2945fbf-3178-420a-bfaf-d0d9c91d610a/volumes" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.792715 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.003706 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.149044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:30 crc kubenswrapper[4775]: W0127 11:41:30.155154 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d213b2_8a0b_479c_8c94_148f1afe1db0.slice/crio-6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335 WatchSource:0}: Error finding container 6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335: Status 404 returned error can't find the container with id 6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335 Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"c8d213b2-8a0b-479c-8c94-148f1afe1db0","Type":"ContainerStarted","Data":"b8e639c27679e58dbb9eace93f97e8e6619add6f6af8e248118196a362f0cd3c"} Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429315 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8d213b2-8a0b-479c-8c94-148f1afe1db0","Type":"ContainerStarted","Data":"6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335"} Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429530 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.449493 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.449472553 podStartE2EDuration="1.449472553s" podCreationTimestamp="2026-01-27 11:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:30.444040204 +0000 UTC m=+1269.585637991" watchObservedRunningTime="2026-01-27 11:41:30.449472553 +0000 UTC m=+1269.591070330" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.077128 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.078487 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.792226 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.833366 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.099054 4775 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/rabbitmq-server-0" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" containerID="cri-o://0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" gracePeriod=604796 Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.468330 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.948886 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" containerID="cri-o://d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" gracePeriod=604797 Jan 27 11:41:35 crc kubenswrapper[4775]: I0127 11:41:35.790734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:35 crc kubenswrapper[4775]: I0127 11:41:35.791223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.804688 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="451ba9e3-91a7-4fd5-9e95-b827186dee9d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.804745 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="451ba9e3-91a7-4fd5-9e95-b827186dee9d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.818003 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" 
Jan 27 11:41:37 crc kubenswrapper[4775]: I0127 11:41:37.076609 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:37 crc kubenswrapper[4775]: I0127 11:41:37.077252 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:38 crc kubenswrapper[4775]: I0127 11:41:38.095654 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b95ff32a-7b7f-43d8-b521-6d07c8d78c99" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:38 crc kubenswrapper[4775]: I0127 11:41:38.095685 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b95ff32a-7b7f-43d8-b521-6d07c8d78c99" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.513089 4775 generic.go:334] "Generic (PLEG): container finished" podID="01ba029b-2296-4519-b6b1-04674355258f" containerID="0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" exitCode=0 Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.513308 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1"} Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.780463 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.851188 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.856067 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891465 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891491 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891582 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891724 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891745 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod 
\"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.893731 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.894684 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.894703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.902024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.904089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd" (OuterVolumeSpecName: "kube-api-access-mwnfd") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "kube-api-access-mwnfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915460 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info" (OuterVolumeSpecName: "pod-info") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915320 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.961406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data" (OuterVolumeSpecName: "config-data") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.973999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf" (OuterVolumeSpecName: "server-conf") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999620 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999652 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999661 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999677 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999686 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999695 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999704 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999712 4775 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999720 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999728 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.017173 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.031576 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.101848 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.101887 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.400318 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414044 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414227 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414266 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414486 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414528 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.415941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416889 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420593 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420608 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420947 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info" (OuterVolumeSpecName: "pod-info") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.421135 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.421318 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.424164 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg" (OuterVolumeSpecName: "kube-api-access-2rgjg") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "kube-api-access-2rgjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.462536 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data" (OuterVolumeSpecName: "config-data") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.512173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf" (OuterVolumeSpecName: "server-conf") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523788 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523812 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523821 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523830 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523850 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523860 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgjg\" (UniqueName: 
\"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523869 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524589 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524717 4775 scope.go:117] "RemoveContainer" containerID="0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524891 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539213 4775 generic.go:334] "Generic (PLEG): container finished" podID="83263987-4e3c-4e95-9083-bb6a43f52410" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" exitCode=0 Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539261 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539279 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.540088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.543983 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.560275 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.615885 4775 scope.go:117] "RemoveContainer" containerID="74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.617424 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.625047 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.625241 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.636584 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.655869 4775 scope.go:117] "RemoveContainer" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.713561 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.713992 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714007 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714032 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="setup-container"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714040 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="setup-container"
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714050 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="setup-container"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714059 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="setup-container"
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714087 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714094 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.717028 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.717067 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.718555 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723063 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-44htb"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723320 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723344 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723351 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723370 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.735272 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.745308 4775 scope.go:117] "RemoveContainer" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783393 4775 scope.go:117] "RemoveContainer" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.783836 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": container with ID starting with d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b not found: ID does not exist" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783880 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} err="failed to get container status \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": rpc error: code = NotFound desc = could not find container \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": container with ID starting with d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b not found: ID does not exist"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783908 4775 scope.go:117] "RemoveContainer" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"
Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.784219 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": container with ID starting with 235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55 not found: ID does not exist" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.784259 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"} err="failed to get container status \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": rpc error: code = NotFound desc = could not find container \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": container with ID starting with 235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55 not found: ID does not exist"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829796 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829812 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829857 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829922 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.830001 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.830025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.871362 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.879343 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.898733 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.900801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.902849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903009 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gp9fv"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.902885 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903318 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903431 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903660 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.904392 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.918873 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934276 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934479 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.936270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.936432 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.937748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.937829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.938660 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.939787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.940216 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.941961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.943480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.946306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.954150 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.987191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.037008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.048514 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139961 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140836 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140897 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.141072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.141223 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.142305 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.142871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144180 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.145721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.155123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.155120 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.162041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.170484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.222349 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.490627 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.551754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"a1e55a1eea034ce3d2707a029a15d3cb21215a7a1edf9f4c3f4c4b2e615390a5"}
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.691620 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.764638 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ba029b-2296-4519-b6b1-04674355258f" path="/var/lib/kubelet/pods/01ba029b-2296-4519-b6b1-04674355258f/volumes"
Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.765976 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" path="/var/lib/kubelet/pods/83263987-4e3c-4e95-9083-bb6a43f52410/volumes"
Jan 27 11:41:42 crc kubenswrapper[4775]: I0127 11:41:42.563706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"325be91f95532ec391b58080d6515074fcc561c3699a2176a132f7fad241a067"}
Jan 27 11:41:43 crc kubenswrapper[4775]: I0127 11:41:43.577726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d"}
Jan 27 11:41:43 crc kubenswrapper[4775]: I0127 11:41:43.579858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b"}
Jan 27 11:41:44 crc kubenswrapper[4775]: I0127 11:41:44.563151 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: i/o timeout"
Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.798966 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.799743 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.800655 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.805875 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:41:46 crc kubenswrapper[4775]: I0127 11:41:46.606120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:41:46 crc kubenswrapper[4775]: I0127 11:41:46.613287 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.080856 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.082916 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.091500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.618716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.827334 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"]
Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.844640 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.850129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.858103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922471 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " 
pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922796 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " 
pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" 
Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.026103 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.026115 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.052167 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.179700 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.602937 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.638831 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerStarted","Data":"df8f05a4df2923539bd706608b950d7b669b477c0ab0a1ec9d75d5d196841bbe"} Jan 27 11:41:50 crc kubenswrapper[4775]: I0127 11:41:50.651305 4775 generic.go:334] "Generic (PLEG): container finished" podID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" exitCode=0 Jan 27 11:41:50 crc kubenswrapper[4775]: I0127 11:41:50.651427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1"} Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.660932 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerStarted","Data":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.661318 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.692268 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" podStartSLOduration=3.692243863 podStartE2EDuration="3.692243863s" podCreationTimestamp="2026-01-27 11:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:51.6815763 +0000 UTC m=+1290.823174117" watchObservedRunningTime="2026-01-27 11:41:51.692243863 +0000 UTC m=+1290.833841680" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.696986 4775 generic.go:334] "Generic (PLEG): container finished" podID="8431139c-b870-4787-9a1c-758e9241e776" containerID="876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" exitCode=137 Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerDied","Data":"876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f"} Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerDied","Data":"448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155"} Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697637 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.746052 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.873673 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp" (OuterVolumeSpecName: "kube-api-access-f8gnp") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "kube-api-access-f8gnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.896281 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.900204 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data" (OuterVolumeSpecName: "config-data") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970493 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970527 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970536 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.705106 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.738380 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.761561 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.779508 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: E0127 11:41:56.779994 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780008 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780211 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780862 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.783872 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.784830 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887402 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887478 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.989737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.989967 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.990107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.998939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.001981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.007619 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.100749 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.542277 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:57 crc kubenswrapper[4775]: W0127 11:41:57.545799 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4732753_3f10_4604_89d0_0c074829e53f.slice/crio-384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e WatchSource:0}: Error finding container 384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e: Status 404 returned error can't find the container with id 384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.715334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4732753-3f10-4604-89d0-0c074829e53f","Type":"ContainerStarted","Data":"384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e"} Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.779324 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8431139c-b870-4787-9a1c-758e9241e776" path="/var/lib/kubelet/pods/8431139c-b870-4787-9a1c-758e9241e776/volumes" Jan 27 11:41:58 crc kubenswrapper[4775]: I0127 11:41:58.727086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4732753-3f10-4604-89d0-0c074829e53f","Type":"ContainerStarted","Data":"0528b6a09ded0f641aee83b5f822082966ea860c25272d3498d9d2637382a76c"} Jan 27 11:41:58 crc kubenswrapper[4775]: I0127 11:41:58.752843 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.752825679 podStartE2EDuration="2.752825679s" podCreationTimestamp="2026-01-27 11:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 11:41:58.749684854 +0000 UTC m=+1297.891282641" watchObservedRunningTime="2026-01-27 11:41:58.752825679 +0000 UTC m=+1297.894423456" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.182289 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.269790 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.270069 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" containerID="cri-o://cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" gracePeriod=10 Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.413391 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.414885 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.431264 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538866 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539288 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641627 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.642400 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.643100 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.643159 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644400 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.671537 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.741234 4775 generic.go:334] "Generic (PLEG): container finished" podID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerID="cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" exitCode=0 Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c"} Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742243 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692"} Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742257 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.743570 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.868651 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946433 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946758 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: 
\"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946776 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.951325 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f" (OuterVolumeSpecName: "kube-api-access-brv5f") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "kube-api-access-brv5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.006393 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.006412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.011261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.023006 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config" (OuterVolumeSpecName: "config") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.034723 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.048848 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049174 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049184 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049193 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049203 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049215 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.242315 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750836 4775 generic.go:334] "Generic (PLEG): container finished" podID="f6c54a70-a562-4fef-b3fe-14e2a3029229" 
containerID="297bbe2bfb1cf7fdd07c7efe8142ca1d447763430eb9dc9194da892803da260f" exitCode=0 Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750907 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerDied","Data":"297bbe2bfb1cf7fdd07c7efe8142ca1d447763430eb9dc9194da892803da260f"} Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750938 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerStarted","Data":"4ba7f598c87a6deb7e40d06df6db66190f3761dae9c7e9b9f7699468d7620492"} Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750942 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.946801 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.955212 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.758318 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" path="/var/lib/kubelet/pods/160a0f00-a19e-4522-b8ea-2a14f87906e9/volumes" Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.771725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerStarted","Data":"b359ce9b935c02b0ebbd3c738ede05c444329836ac0b725930307af83477dab3"} Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.773079 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:42:01 crc 
kubenswrapper[4775]: I0127 11:42:01.796580 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" podStartSLOduration=2.796561309 podStartE2EDuration="2.796561309s" podCreationTimestamp="2026-01-27 11:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:01.794188344 +0000 UTC m=+1300.935786131" watchObservedRunningTime="2026-01-27 11:42:01.796561309 +0000 UTC m=+1300.938159086" Jan 27 11:42:02 crc kubenswrapper[4775]: I0127 11:42:02.101332 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.100914 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.128690 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.851244 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.759642 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.841923 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.842600 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" containerID="cri-o://7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" gracePeriod=10 Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.302570 4775 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.446771 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.446898 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447110 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447203 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447279 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.452775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g" (OuterVolumeSpecName: "kube-api-access-rp58g") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "kube-api-access-rp58g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.507540 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.508806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.509892 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.511079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.511673 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config" (OuterVolumeSpecName: "config") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.515326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549527 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549570 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549583 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549594 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549603 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549613 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549623 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858311 
4775 generic.go:334] "Generic (PLEG): container finished" podID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" exitCode=0 Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858379 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"df8f05a4df2923539bd706608b950d7b669b477c0ab0a1ec9d75d5d196841bbe"} Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858383 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858400 4775 scope.go:117] "RemoveContainer" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.883588 4775 scope.go:117] "RemoveContainer" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.895472 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.906529 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.911878 4775 scope.go:117] "RemoveContainer" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: E0127 11:42:10.912241 4775 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": container with ID starting with 7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7 not found: ID does not exist" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912293 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} err="failed to get container status \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": rpc error: code = NotFound desc = could not find container \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": container with ID starting with 7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7 not found: ID does not exist" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912328 4775 scope.go:117] "RemoveContainer" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: E0127 11:42:10.912655 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": container with ID starting with 65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1 not found: ID does not exist" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912707 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1"} err="failed to get container status \"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": rpc error: code = NotFound desc = could not find container 
\"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": container with ID starting with 65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1 not found: ID does not exist" Jan 27 11:42:11 crc kubenswrapper[4775]: I0127 11:42:11.756668 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" path="/var/lib/kubelet/pods/b5dacfeb-690d-4289-ae2d-0123e4435d4a/volumes" Jan 27 11:42:15 crc kubenswrapper[4775]: I0127 11:42:15.915346 4775 generic.go:334] "Generic (PLEG): container finished" podID="6c46c48a-ba77-4494-bc4e-f463a4072952" containerID="36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d" exitCode=0 Jan 27 11:42:15 crc kubenswrapper[4775]: I0127 11:42:15.915438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerDied","Data":"36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.932530 4775 generic.go:334] "Generic (PLEG): container finished" podID="bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d" containerID="81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b" exitCode=0 Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.932612 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerDied","Data":"81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.936687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"5dd7f324981da4ba980d85e5abd1f53c99069809aa32f8868e097280dd75cdcf"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.937223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-server-0" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.946431 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"dc35a311d40871e62d3663070ac78fd8947b5196bdda1cda6ea77c0a3b003d3a"} Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.946953 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.970836 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.970813584 podStartE2EDuration="37.970813584s" podCreationTimestamp="2026-01-27 11:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:16.991972589 +0000 UTC m=+1316.133570386" watchObservedRunningTime="2026-01-27 11:42:17.970813584 +0000 UTC m=+1317.112411361" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.978106 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.978089924 podStartE2EDuration="37.978089924s" podCreationTimestamp="2026-01-27 11:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:17.970933008 +0000 UTC m=+1317.112530795" watchObservedRunningTime="2026-01-27 11:42:17.978089924 +0000 UTC m=+1317.119687691" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.835653 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836714 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836735 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836749 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836757 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836773 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836781 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836809 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836817 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.837031 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.837071 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.838001 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.841944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.842391 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.842601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.843185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.860677 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008397 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: 
I0127 11:42:23.008612 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110341 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: 
\"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.115930 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.116496 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.116869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.129967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.221542 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.822614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:24 crc kubenswrapper[4775]: I0127 11:42:24.010486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerStarted","Data":"c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722"} Jan 27 11:42:29 crc kubenswrapper[4775]: I0127 11:42:29.517872 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:42:29 crc kubenswrapper[4775]: I0127 11:42:29.518534 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 
27 11:42:31 crc kubenswrapper[4775]: I0127 11:42:31.053207 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 11:42:31 crc kubenswrapper[4775]: I0127 11:42:31.228668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:42:34 crc kubenswrapper[4775]: I0127 11:42:34.112645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerStarted","Data":"4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2"} Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.147425 4775 scope.go:117] "RemoveContainer" containerID="8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d" Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.170224 4775 scope.go:117] "RemoveContainer" containerID="ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.213793 4775 scope.go:117] "RemoveContainer" containerID="97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6" Jan 27 11:42:46 crc kubenswrapper[4775]: I0127 11:42:46.235780 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerID="4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2" exitCode=0 Jan 27 11:42:46 crc kubenswrapper[4775]: I0127 11:42:46.235912 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerDied","Data":"4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2"} Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.802085 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991181 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991269 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991307 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.998610 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz" (OuterVolumeSpecName: "kube-api-access-z2lkz") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "kube-api-access-z2lkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.998651 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.020761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory" (OuterVolumeSpecName: "inventory") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.022518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093856 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093899 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093913 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093926 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254711 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerDied","Data":"c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722"} Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254758 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254760 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.343496 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:48 crc kubenswrapper[4775]: E0127 11:42:48.344048 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.344071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.344307 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.345074 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.347682 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.348149 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.350764 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.350830 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.358221 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501444 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.607671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.609906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.624247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.667069 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:49 crc kubenswrapper[4775]: I0127 11:42:49.229423 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:49 crc kubenswrapper[4775]: I0127 11:42:49.268571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerStarted","Data":"4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b"} Jan 27 11:42:50 crc kubenswrapper[4775]: I0127 11:42:50.282193 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerStarted","Data":"f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1"} Jan 27 11:42:50 crc kubenswrapper[4775]: I0127 11:42:50.310924 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" podStartSLOduration=1.786030725 podStartE2EDuration="2.310890528s" podCreationTimestamp="2026-01-27 11:42:48 +0000 UTC" firstStartedPulling="2026-01-27 11:42:49.227305316 +0000 UTC m=+1348.368903113" lastFinishedPulling="2026-01-27 11:42:49.752165089 +0000 UTC m=+1348.893762916" observedRunningTime="2026-01-27 11:42:50.303993849 +0000 UTC m=+1349.445591656" watchObservedRunningTime="2026-01-27 11:42:50.310890528 +0000 UTC m=+1349.452488315" Jan 27 11:42:53 crc kubenswrapper[4775]: I0127 11:42:53.310960 4775 generic.go:334] "Generic (PLEG): container finished" podID="e2226633-918b-423c-a329-bfd52943a1b0" containerID="f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1" exitCode=0 Jan 27 11:42:53 crc kubenswrapper[4775]: I0127 11:42:53.311050 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerDied","Data":"f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1"} Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.759252 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.922851 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.923406 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.923511 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.930976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms" (OuterVolumeSpecName: "kube-api-access-49cms") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "kube-api-access-49cms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.954682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.954904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory" (OuterVolumeSpecName: "inventory") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026346 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026384 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026402 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.334961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" 
event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerDied","Data":"4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b"} Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.335022 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.335087 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.414294 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:55 crc kubenswrapper[4775]: E0127 11:42:55.415016 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.415045 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.415294 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.416095 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.418401 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.418537 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.419441 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.421970 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.427405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.536786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.536863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: 
I0127 11:42:55.536894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.537294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.644364 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.645367 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.646374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.660422 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.743323 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:56 crc kubenswrapper[4775]: I0127 11:42:56.465948 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.356839 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerStarted","Data":"f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82"} Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.357549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerStarted","Data":"02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91"} Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.389774 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" podStartSLOduration=1.989230566 podStartE2EDuration="2.389746736s" podCreationTimestamp="2026-01-27 11:42:55 +0000 UTC" firstStartedPulling="2026-01-27 11:42:56.471675313 +0000 UTC m=+1355.613273090" 
lastFinishedPulling="2026-01-27 11:42:56.872191483 +0000 UTC m=+1356.013789260" observedRunningTime="2026-01-27 11:42:57.380140053 +0000 UTC m=+1356.521737890" watchObservedRunningTime="2026-01-27 11:42:57.389746736 +0000 UTC m=+1356.531344553" Jan 27 11:42:59 crc kubenswrapper[4775]: I0127 11:42:59.517789 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:42:59 crc kubenswrapper[4775]: I0127 11:42:59.518156 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517353 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517900 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517960 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 
11:43:29.518780 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.518847 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51" gracePeriod=600 Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692370 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51" exitCode=0 Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"} Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692794 4775 scope.go:117] "RemoveContainer" containerID="26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" Jan 27 11:43:30 crc kubenswrapper[4775]: I0127 11:43:30.703371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"} Jan 27 11:43:39 crc kubenswrapper[4775]: I0127 11:43:39.316627 4775 scope.go:117] "RemoveContainer" 
containerID="ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec" Jan 27 11:43:39 crc kubenswrapper[4775]: I0127 11:43:39.364578 4775 scope.go:117] "RemoveContainer" containerID="4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.172685 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.175487 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.181017 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.182693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.183103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod 
\"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419895 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.420903 4775 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.426345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.439984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.555766 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.994512 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670106 4775 generic.go:334] "Generic (PLEG): container finished" podID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerID="5c27e62a0c2d5741ef2ffd6f30031b9143ba1b815c078ea3acffebfbaf79467e" exitCode=0 Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerDied","Data":"5c27e62a0c2d5741ef2ffd6f30031b9143ba1b815c078ea3acffebfbaf79467e"} Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670507 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerStarted","Data":"a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245"} Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.002764 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.071262 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.076785 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.076867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8" (OuterVolumeSpecName: "kube-api-access-74wv8") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "kube-api-access-74wv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.171964 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.171994 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.172003 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686384 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerDied","Data":"a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245"} Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686753 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686426 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:29 crc kubenswrapper[4775]: I0127 11:45:29.517937 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:45:29 crc kubenswrapper[4775]: I0127 11:45:29.518556 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.561105 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:49 crc kubenswrapper[4775]: E0127 11:45:49.563203 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.563237 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.563493 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.564946 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.571638 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.622694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.622977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.623182 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724757 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.725103 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.725141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.743195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.886123 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:50 crc kubenswrapper[4775]: I0127 11:45:50.325551 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:50 crc kubenswrapper[4775]: E0127 11:45:50.652356 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bbbbe1_7f7d_439b_8a67_af6503dd0d59.slice/crio-conmon-36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.159627 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c" exitCode=0 Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.159738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"} Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.160025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerStarted","Data":"e657cefbc3aaf1706e7be3f7e18b9398cb99f6e7fa4bb2b7577af101a745ab39"} Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.162048 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:45:53 crc kubenswrapper[4775]: I0127 11:45:53.177994 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc" exitCode=0 Jan 27 
11:45:53 crc kubenswrapper[4775]: I0127 11:45:53.178033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"} Jan 27 11:45:54 crc kubenswrapper[4775]: I0127 11:45:54.188366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerStarted","Data":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"} Jan 27 11:45:54 crc kubenswrapper[4775]: I0127 11:45:54.216222 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4jsp" podStartSLOduration=2.722288125 podStartE2EDuration="5.216200123s" podCreationTimestamp="2026-01-27 11:45:49 +0000 UTC" firstStartedPulling="2026-01-27 11:45:51.16159768 +0000 UTC m=+1530.303195457" lastFinishedPulling="2026-01-27 11:45:53.655509678 +0000 UTC m=+1532.797107455" observedRunningTime="2026-01-27 11:45:54.203553796 +0000 UTC m=+1533.345151583" watchObservedRunningTime="2026-01-27 11:45:54.216200123 +0000 UTC m=+1533.357797900" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.517437 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.518048 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.886922 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.887259 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.935523 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:00 crc kubenswrapper[4775]: I0127 11:46:00.282852 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:00 crc kubenswrapper[4775]: I0127 11:46:00.329250 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.252759 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4jsp" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server" containerID="cri-o://1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" gracePeriod=2 Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.684778 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp"
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860592 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") "
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") "
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860774 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") "
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.862080 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities" (OuterVolumeSpecName: "utilities") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.866863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k" (OuterVolumeSpecName: "kube-api-access-zh25k") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "kube-api-access-zh25k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.888680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963335 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963380 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963394 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262135 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" exitCode=0
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262194 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"}
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262635 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"e657cefbc3aaf1706e7be3f7e18b9398cb99f6e7fa4bb2b7577af101a745ab39"}
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262655 4775 scope.go:117] "RemoveContainer" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.291400 4775 scope.go:117] "RemoveContainer" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.292674 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"]
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.300410 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"]
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.324090 4775 scope.go:117] "RemoveContainer" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.352829 4775 scope.go:117] "RemoveContainer" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"
Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.353296 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": container with ID starting with 1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485 not found: ID does not exist" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353351 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"} err="failed to get container status \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": rpc error: code = NotFound desc = could not find container \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": container with ID starting with 1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485 not found: ID does not exist"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353384 4775 scope.go:117] "RemoveContainer" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"
Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.353805 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": container with ID starting with b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc not found: ID does not exist" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353847 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"} err="failed to get container status \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": rpc error: code = NotFound desc = could not find container \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": container with ID starting with b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc not found: ID does not exist"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353875 4775 scope.go:117] "RemoveContainer" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"
Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.354297 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": container with ID starting with 36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c not found: ID does not exist" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.354336 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"} err="failed to get container status \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": rpc error: code = NotFound desc = could not find container \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": container with ID starting with 36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c not found: ID does not exist"
Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.755690 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" path="/var/lib/kubelet/pods/65bbbbe1-7f7d-439b-8a67-af6503dd0d59/volumes"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.957523 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"]
Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958668 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-utilities"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958688 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-utilities"
Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958703 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958713 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server"
Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958738 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-content"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958930 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-content"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.959180 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.960920 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.972546 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"]
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.148892 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.149027 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.149086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251681 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.274079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.284985 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.810141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"]
Jan 27 11:46:17 crc kubenswrapper[4775]: W0127 11:46:17.818305 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01280896_28bf_48e8_82b4_a28e65351bf8.slice/crio-5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611 WatchSource:0}: Error finding container 5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611: Status 404 returned error can't find the container with id 5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611
Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405395 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a" exitCode=0
Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a"}
Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerStarted","Data":"5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611"}
Jan 27 11:46:20 crc kubenswrapper[4775]: I0127 11:46:20.422675 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0" exitCode=0
Jan 27 11:46:20 crc kubenswrapper[4775]: I0127 11:46:20.422868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0"}
Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.458558 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerID="f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82" exitCode=0
Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.458650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerDied","Data":"f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82"}
Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.461830 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerStarted","Data":"b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31"}
Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.500462 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fk7jm" podStartSLOduration=3.087373317 podStartE2EDuration="8.500430455s" podCreationTimestamp="2026-01-27 11:46:16 +0000 UTC" firstStartedPulling="2026-01-27 11:46:18.407838712 +0000 UTC m=+1557.549436489" lastFinishedPulling="2026-01-27 11:46:23.82089582 +0000 UTC m=+1562.962493627" observedRunningTime="2026-01-27 11:46:24.49550416 +0000 UTC m=+1563.637101947" watchObservedRunningTime="2026-01-27 11:46:24.500430455 +0000 UTC m=+1563.642028232"
Jan 27 11:46:25 crc kubenswrapper[4775]: I0127 11:46:25.868258 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.033875 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") "
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") "
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042487 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") "
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") "
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.044756 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.049799 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.049811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254" (OuterVolumeSpecName: "kube-api-access-wf254") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "kube-api-access-wf254". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.072898 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory" (OuterVolumeSpecName: "inventory") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.076756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144246 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144278 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144290 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144305 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerDied","Data":"02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91"}
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476781 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476844 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.565416 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"]
Jan 27 11:46:26 crc kubenswrapper[4775]: E0127 11:46:26.566118 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.566211 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.566526 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.567389 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571768 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571861 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571901 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.586778 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"]
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755938 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.857242 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.857383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.858150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.861820 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.862025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.873351 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.893281 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.286074 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.286383 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.334247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fk7jm"
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.455334 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"]
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.502314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerStarted","Data":"3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a"}
Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.753903 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" path="/var/lib/kubelet/pods/f53ed1d7-9aa1-49d4-8396-c3487e0465d6/volumes"
Jan 27 11:46:28 crc kubenswrapper[4775]: I0127 11:46:28.511292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerStarted","Data":"68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911"}
Jan 27 11:46:28 crc kubenswrapper[4775]: I0127 11:46:28.535071 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" podStartSLOduration=1.835598302 podStartE2EDuration="2.535046333s" podCreationTimestamp="2026-01-27 11:46:26 +0000 UTC" firstStartedPulling="2026-01-27 11:46:27.467603772 +0000 UTC m=+1566.609201549" lastFinishedPulling="2026-01-27 11:46:28.167051803 +0000 UTC m=+1567.308649580" observedRunningTime="2026-01-27 11:46:28.526433307 +0000 UTC m=+1567.668031104" watchObservedRunningTime="2026-01-27 11:46:28.535046333 +0000 UTC m=+1567.676644130"
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.041704 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m5645"]
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.050991 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m5645"]
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517721 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517778 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.518546 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.518597 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" gracePeriod=600
Jan 27 11:46:29 crc kubenswrapper[4775]: E0127 11:46:29.659952 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.755015 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" path="/var/lib/kubelet/pods/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d/volumes"
Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.530979 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" exitCode=0
Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.531048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"}
Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.531347 4775 scope.go:117] "RemoveContainer" containerID="cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"
Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.532082 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:46:30 crc kubenswrapper[4775]: E0127 11:46:30.532376 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.030761 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"]
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.040580 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-599fs"]
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.051032 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-599fs"]
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.059789 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"]
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.759023 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" path="/var/lib/kubelet/pods/0bbde61d-aca8-4b36-8896-9c0db3e081be/volumes"
Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.760393 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" path="/var/lib/kubelet/pods/c3cd1d9e-b735-4f90-b92a-00353e576e10/volumes"
Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.034248 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"]
Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.044358 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"]
Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.053385 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"]
Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.066612 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"]
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.033377 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vvxg4"]
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.041164 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vvxg4"]
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.759714 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" path="/var/lib/kubelet/pods/24d04bb6-3007-42c5-9753-746a6eeb7d1c/volumes"
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.760949 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" path="/var/lib/kubelet/pods/f208e1de-fc0e-4deb-a093-d27604b3931f/volumes"
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.762298 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f577e755-a863-4fea-9288-6cd30168b405" path="/var/lib/kubelet/pods/f577e755-a863-4fea-9288-6cd30168b405/volumes"
Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.343440 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="ready" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.387093 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.620840 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fk7jm" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" containerID="cri-o://b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" gracePeriod=2 Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.634234 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" exitCode=0 Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.634306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31"} Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.801272 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.978676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.979367 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.980882 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.981824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities" (OuterVolumeSpecName: "utilities") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.986573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p" (OuterVolumeSpecName: "kube-api-access-fqv2p") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "kube-api-access-fqv2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.028230 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084005 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084059 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084079 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.533883 4775 scope.go:117] "RemoveContainer" containerID="7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.571579 4775 scope.go:117] "RemoveContainer" containerID="8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.643756 4775 scope.go:117] "RemoveContainer" containerID="5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611"} Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653292 4775 scope.go:117] "RemoveContainer" containerID="b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653515 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.671420 4775 scope.go:117] "RemoveContainer" containerID="0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.693769 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.701663 4775 scope.go:117] "RemoveContainer" containerID="00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.702124 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.720875 4775 scope.go:117] "RemoveContainer" containerID="b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.754813 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" path="/var/lib/kubelet/pods/01280896-28bf-48e8-82b4-a28e65351bf8/volumes" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.769789 4775 scope.go:117] "RemoveContainer" containerID="4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.791689 4775 scope.go:117] "RemoveContainer" 
containerID="a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.843694 4775 scope.go:117] "RemoveContainer" containerID="86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.864529 4775 scope.go:117] "RemoveContainer" containerID="ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.880561 4775 scope.go:117] "RemoveContainer" containerID="f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.902204 4775 scope.go:117] "RemoveContainer" containerID="b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.940409 4775 scope.go:117] "RemoveContainer" containerID="cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.956905 4775 scope.go:117] "RemoveContainer" containerID="1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.979761 4775 scope.go:117] "RemoveContainer" containerID="3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985" Jan 27 11:46:40 crc kubenswrapper[4775]: I0127 11:46:40.017994 4775 scope.go:117] "RemoveContainer" containerID="25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e" Jan 27 11:46:40 crc kubenswrapper[4775]: I0127 11:46:40.102685 4775 scope.go:117] "RemoveContainer" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" Jan 27 11:46:45 crc kubenswrapper[4775]: I0127 11:46:45.744960 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:46:45 crc kubenswrapper[4775]: E0127 11:46:45.745543 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:46:58 crc kubenswrapper[4775]: I0127 11:46:58.745297 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:46:58 crc kubenswrapper[4775]: E0127 11:46:58.746191 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.055274 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.066598 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.077865 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.090030 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.098349 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.105765 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.114834 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.124633 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.136127 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.148734 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.158362 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.168143 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.757243 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066d45f9-5f72-4b81-8166-0238863b8789" path="/var/lib/kubelet/pods/066d45f9-5f72-4b81-8166-0238863b8789/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.758473 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" path="/var/lib/kubelet/pods/58a046ea-e8eb-40ed-a64d-b382e0a2f331/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.759266 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" path="/var/lib/kubelet/pods/90455f95-bcc6-4229-948c-599c91a08b2a/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.759996 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" path="/var/lib/kubelet/pods/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.761516 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c495d390-f7ca-4867-b334-263c03f6b211" path="/var/lib/kubelet/pods/c495d390-f7ca-4867-b334-263c03f6b211/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.762289 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" path="/var/lib/kubelet/pods/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07/volumes" Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.035211 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.043224 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.755370 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" path="/var/lib/kubelet/pods/71de6180-54da-4c3b-8aea-73a2ccfd936a/volumes" Jan 27 11:47:13 crc kubenswrapper[4775]: I0127 11:47:13.745019 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:13 crc kubenswrapper[4775]: E0127 11:47:13.746000 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:24 crc kubenswrapper[4775]: I0127 11:47:24.745581 4775 scope.go:117] "RemoveContainer" 
containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:24 crc kubenswrapper[4775]: E0127 11:47:24.746615 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.555734 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556213 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-content" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556234 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-content" Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556250 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-utilities" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556258 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-utilities" Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556273 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556281 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556519 
4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.557915 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.566343 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.582900 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.582952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.583019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod 
\"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.685035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.685072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.702097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: 
\"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.883073 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:26 crc kubenswrapper[4775]: I0127 11:47:26.353663 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115149 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" exitCode=0 Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115230 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0"} Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"c86625840bf41ea9181153767b3d3a82a3b86875abf42fbf6fc07a9e94beac5b"} Jan 27 11:47:28 crc kubenswrapper[4775]: I0127 11:47:28.124833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} Jan 27 11:47:31 crc kubenswrapper[4775]: I0127 11:47:31.163535 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" exitCode=0 Jan 27 11:47:31 crc kubenswrapper[4775]: I0127 11:47:31.163633 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} Jan 27 11:47:32 crc kubenswrapper[4775]: I0127 11:47:32.175510 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} Jan 27 11:47:32 crc kubenswrapper[4775]: I0127 11:47:32.204185 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbxcd" podStartSLOduration=2.388212512 podStartE2EDuration="7.204163302s" podCreationTimestamp="2026-01-27 11:47:25 +0000 UTC" firstStartedPulling="2026-01-27 11:47:27.117476098 +0000 UTC m=+1626.259073895" lastFinishedPulling="2026-01-27 11:47:31.933426908 +0000 UTC m=+1631.075024685" observedRunningTime="2026-01-27 11:47:32.195011613 +0000 UTC m=+1631.336609410" watchObservedRunningTime="2026-01-27 11:47:32.204163302 +0000 UTC m=+1631.345761089" Jan 27 11:47:35 crc kubenswrapper[4775]: I0127 11:47:35.884271 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:35 crc kubenswrapper[4775]: I0127 11:47:35.884950 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.036286 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.044623 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.924753 4775 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-pbxcd" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" probeResult="failure" output=< Jan 27 11:47:36 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:47:36 crc kubenswrapper[4775]: > Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.027207 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.034814 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.744549 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:37 crc kubenswrapper[4775]: E0127 11:47:37.744847 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.755226 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" path="/var/lib/kubelet/pods/73aaf8f0-0380-4eff-875b-90da115dba37/volumes" Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.755809 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" path="/var/lib/kubelet/pods/ca5aab7c-3b7a-4996-82f5-478d4100bb6c/volumes" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.376184 4775 scope.go:117] "RemoveContainer" containerID="adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42" Jan 27 11:47:40 
crc kubenswrapper[4775]: I0127 11:47:40.434816 4775 scope.go:117] "RemoveContainer" containerID="3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.469393 4775 scope.go:117] "RemoveContainer" containerID="7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.489597 4775 scope.go:117] "RemoveContainer" containerID="ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.542845 4775 scope.go:117] "RemoveContainer" containerID="876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.586930 4775 scope.go:117] "RemoveContainer" containerID="17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.610140 4775 scope.go:117] "RemoveContainer" containerID="9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.679887 4775 scope.go:117] "RemoveContainer" containerID="60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.699168 4775 scope.go:117] "RemoveContainer" containerID="680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.717212 4775 scope.go:117] "RemoveContainer" containerID="ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.739711 4775 scope.go:117] "RemoveContainer" containerID="8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.766938 4775 scope.go:117] "RemoveContainer" containerID="99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf" Jan 27 11:47:45 crc kubenswrapper[4775]: I0127 
11:47:45.936771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:45 crc kubenswrapper[4775]: I0127 11:47:45.983396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:46 crc kubenswrapper[4775]: I0127 11:47:46.174910 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:47 crc kubenswrapper[4775]: I0127 11:47:47.308544 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbxcd" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" containerID="cri-o://e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" gracePeriod=2 Jan 27 11:47:47 crc kubenswrapper[4775]: I0127 11:47:47.854555 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023592 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.024812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities" (OuterVolumeSpecName: "utilities") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.029969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx" (OuterVolumeSpecName: "kube-api-access-z8ldx") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "kube-api-access-z8ldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.126875 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.127317 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.145222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.229045 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319473 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" exitCode=0 Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319734 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.320814 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"c86625840bf41ea9181153767b3d3a82a3b86875abf42fbf6fc07a9e94beac5b"} Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.320836 4775 scope.go:117] "RemoveContainer" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.361666 4775 scope.go:117] "RemoveContainer" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.407615 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.415671 4775 scope.go:117] "RemoveContainer" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.416788 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.447470 4775 scope.go:117] "RemoveContainer" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 11:47:48.447853 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": container with ID starting with e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1 not found: ID does not exist" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.447899 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} err="failed to get container status \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": rpc error: code = NotFound desc = could not find container \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": container with ID starting with e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1 not found: ID does not exist" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.447927 4775 scope.go:117] "RemoveContainer" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 11:47:48.448120 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": container with ID starting with 533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4 not found: ID does not exist" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448148 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} err="failed to get container status \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": rpc error: code = NotFound desc = could not find container \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": container with ID starting with 533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4 not found: ID does not exist" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448171 4775 scope.go:117] "RemoveContainer" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 
11:47:48.448959 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": container with ID starting with d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0 not found: ID does not exist" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448985 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0"} err="failed to get container status \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": rpc error: code = NotFound desc = could not find container \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": container with ID starting with d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0 not found: ID does not exist" Jan 27 11:47:49 crc kubenswrapper[4775]: I0127 11:47:49.757405 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" path="/var/lib/kubelet/pods/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f/volumes" Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.034528 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.042961 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.744520 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:50 crc kubenswrapper[4775]: E0127 11:47:50.744793 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.033348 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-74wvb"] Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.040358 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-74wvb"] Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.757170 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" path="/var/lib/kubelet/pods/5c313125-cfde-424b-9bb3-acb232d20ba3/volumes" Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.758047 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" path="/var/lib/kubelet/pods/ba461ef4-49c1-4edc-ac60-1dfb91642c46/volumes" Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.046912 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.057713 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.069434 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.082574 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.427032 4775 generic.go:334] "Generic (PLEG): container finished" podID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerID="68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911" exitCode=0 Jan 27 11:48:00 crc kubenswrapper[4775]: 
I0127 11:48:00.427165 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerDied","Data":"68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911"} Jan 27 11:48:01 crc kubenswrapper[4775]: I0127 11:48:01.759027 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" path="/var/lib/kubelet/pods/0edaeaa2-aa90-484f-854c-db5dd181f61b/volumes" Jan 27 11:48:01 crc kubenswrapper[4775]: I0127 11:48:01.760331 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" path="/var/lib/kubelet/pods/2029cc7b-c115-4c17-8713-c6eed291e963/volumes" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.046265 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.135906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.136006 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.136032 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.143218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz" (OuterVolumeSpecName: "kube-api-access-zfdjz") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "kube-api-access-zfdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.164196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.164641 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory" (OuterVolumeSpecName: "inventory") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238307 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238342 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238354 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerDied","Data":"3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a"} Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447659 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447633 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.526527 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527232 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-content" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527250 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-content" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527267 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527275 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527289 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-utilities" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527295 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-utilities" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527305 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527310 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 
11:48:02.527507 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527528 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.528175 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.530879 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.530890 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.531165 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.531244 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.537223 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646576 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.748463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.748528 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: 
I0127 11:48:02.748573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.756682 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.758693 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.770536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.844807 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.364183 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.457916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerStarted","Data":"36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512"} Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.745177 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:03 crc kubenswrapper[4775]: E0127 11:48:03.745551 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:04 crc kubenswrapper[4775]: I0127 11:48:04.467347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerStarted","Data":"b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8"} Jan 27 11:48:04 crc kubenswrapper[4775]: I0127 11:48:04.485193 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" podStartSLOduration=1.8738177660000002 podStartE2EDuration="2.485175982s" podCreationTimestamp="2026-01-27 11:48:02 +0000 UTC" 
firstStartedPulling="2026-01-27 11:48:03.364468459 +0000 UTC m=+1662.506066256" lastFinishedPulling="2026-01-27 11:48:03.975826695 +0000 UTC m=+1663.117424472" observedRunningTime="2026-01-27 11:48:04.482892842 +0000 UTC m=+1663.624490619" watchObservedRunningTime="2026-01-27 11:48:04.485175982 +0000 UTC m=+1663.626773759" Jan 27 11:48:14 crc kubenswrapper[4775]: I0127 11:48:14.745588 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:14 crc kubenswrapper[4775]: E0127 11:48:14.746375 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:29 crc kubenswrapper[4775]: I0127 11:48:29.745160 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:29 crc kubenswrapper[4775]: E0127 11:48:29.745924 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.040275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.053147 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-6423-account-create-update-h7gvh"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.061795 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-p9q28"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.070500 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.078803 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.091419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-p9q28"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.107489 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.116120 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"]
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.759136 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027dfac2-8504-46aa-9302-19df71441688" path="/var/lib/kubelet/pods/027dfac2-8504-46aa-9302-19df71441688/volumes"
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.759986 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" path="/var/lib/kubelet/pods/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1/volumes"
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.760528 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" path="/var/lib/kubelet/pods/b03d69b1-c651-4b79-9ba1-581dc15737a6/volumes"
Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.761063 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6287027-2778-4115-b173-62b1600d0247" path="/var/lib/kubelet/pods/d6287027-2778-4115-b173-62b1600d0247/volumes"
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.027064 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"]
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.036206 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"]
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.043357 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"]
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.050033 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"]
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.757863 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" path="/var/lib/kubelet/pods/a18608c5-afda-4481-9c6d-a576dfd4d803/volumes"
Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.759147 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" path="/var/lib/kubelet/pods/aeed29be-d561-4bf4-bdc1-c180e1983a3c/volumes"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.036785 4775 scope.go:117] "RemoveContainer" containerID="4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.062439 4775 scope.go:117] "RemoveContainer" containerID="f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.114867 4775 scope.go:117] "RemoveContainer" containerID="41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.174189 4775 scope.go:117] "RemoveContainer" containerID="7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.203062 4775 scope.go:117] "RemoveContainer" containerID="d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.284148 4775 scope.go:117] "RemoveContainer" containerID="3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.303478 4775 scope.go:117] "RemoveContainer" containerID="a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.321133 4775 scope.go:117] "RemoveContainer" containerID="23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.337234 4775 scope.go:117] "RemoveContainer" containerID="398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912"
Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.359822 4775 scope.go:117] "RemoveContainer" containerID="1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c"
Jan 27 11:48:44 crc kubenswrapper[4775]: I0127 11:48:44.745821 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:48:44 crc kubenswrapper[4775]: E0127 11:48:44.747094 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:48:57 crc kubenswrapper[4775]: I0127 11:48:57.747551 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:48:57 crc kubenswrapper[4775]: E0127 11:48:57.748426 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:49:08 crc kubenswrapper[4775]: I0127 11:49:08.745598 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:49:08 crc kubenswrapper[4775]: E0127 11:49:08.746512 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:49:12 crc kubenswrapper[4775]: I0127 11:49:12.036588 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"]
Jan 27 11:49:12 crc kubenswrapper[4775]: I0127 11:49:12.044977 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"]
Jan 27 11:49:13 crc kubenswrapper[4775]: I0127 11:49:13.754444 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" path="/var/lib/kubelet/pods/1b5e7b0a-a4d0-4c64-b273-2b47230efd17/volumes"
Jan 27 11:49:15 crc kubenswrapper[4775]: I0127 11:49:15.094823 4775 generic.go:334] "Generic (PLEG): container finished" podID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerID="b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8" exitCode=0
Jan 27 11:49:15 crc kubenswrapper[4775]: I0127 11:49:15.094909 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerDied","Data":"b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8"}
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.555142 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.691901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") "
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.691977 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") "
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.692029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") "
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.697317 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b" (OuterVolumeSpecName: "kube-api-access-m8f6b") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "kube-api-access-m8f6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.717212 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory" (OuterVolumeSpecName: "inventory") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.721220 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795266 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795301 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795312 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerDied","Data":"36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512"}
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115034 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115487 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190194 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"]
Jan 27 11:49:17 crc kubenswrapper[4775]: E0127 11:49:17.190590 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190606 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190793 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.191369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194335 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194563 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194616 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194869 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202689 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.203225 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"]
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.309688 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.312550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.320704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.510146 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.993073 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"]
Jan 27 11:49:18 crc kubenswrapper[4775]: I0127 11:49:18.122791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerStarted","Data":"dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705"}
Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.131960 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerStarted","Data":"d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1"}
Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.148260 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" podStartSLOduration=1.64297764 podStartE2EDuration="2.14822633s" podCreationTimestamp="2026-01-27 11:49:17 +0000 UTC" firstStartedPulling="2026-01-27 11:49:17.999888833 +0000 UTC m=+1737.141486610" lastFinishedPulling="2026-01-27 11:49:18.505137513 +0000 UTC m=+1737.646735300" observedRunningTime="2026-01-27 11:49:19.144818131 +0000 UTC m=+1738.286415908" watchObservedRunningTime="2026-01-27 11:49:19.14822633 +0000 UTC m=+1738.289824107"
Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.745252 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:49:19 crc kubenswrapper[4775]: E0127 11:49:19.745517 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:49:24 crc kubenswrapper[4775]: I0127 11:49:24.177026 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerID="d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1" exitCode=0
Jan 27 11:49:24 crc kubenswrapper[4775]: I0127 11:49:24.177137 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerDied","Data":"d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1"}
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.661281 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766005 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") "
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") "
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766186 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") "
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.771731 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr" (OuterVolumeSpecName: "kube-api-access-l4mzr") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "kube-api-access-l4mzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.794534 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.816350 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory" (OuterVolumeSpecName: "inventory") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870237 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870283 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870297 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerDied","Data":"dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705"}
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204263 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204745 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354147 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"]
Jan 27 11:49:26 crc kubenswrapper[4775]: E0127 11:49:26.354612 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354640 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354904 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.355693 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.357762 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358101 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358636 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358677 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379113 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"]
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379303 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379466 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379564 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481385 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.485357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.485502 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.507798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.671999 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:49:27 crc kubenswrapper[4775]: I0127 11:49:27.220986 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"]
Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.224689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerStarted","Data":"42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d"}
Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.224975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerStarted","Data":"a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b"}
Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.249797 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" podStartSLOduration=1.8414507 podStartE2EDuration="2.249777084s" podCreationTimestamp="2026-01-27 11:49:26 +0000 UTC" firstStartedPulling="2026-01-27 11:49:27.229461797 +0000 UTC m=+1746.371059574" lastFinishedPulling="2026-01-27 11:49:27.637788141 +0000 UTC m=+1746.779385958" observedRunningTime="2026-01-27 11:49:28.239565297 +0000 UTC m=+1747.381163074" watchObservedRunningTime="2026-01-27 11:49:28.249777084 +0000 UTC m=+1747.391374861"
Jan 27 11:49:34 crc kubenswrapper[4775]: I0127 11:49:34.744699 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:49:34 crc kubenswrapper[4775]: E0127 11:49:34.746505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:49:41 crc kubenswrapper[4775]: I0127 11:49:41.556402 4775 scope.go:117] "RemoveContainer" containerID="cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0"
Jan 27 11:49:46 crc kubenswrapper[4775]: I0127 11:49:46.745295 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:49:46 crc kubenswrapper[4775]: E0127 11:49:46.746514 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:49:58 crc kubenswrapper[4775]: I0127 11:49:58.749812 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:49:58 crc kubenswrapper[4775]: E0127 11:49:58.752654 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:50:07 crc kubenswrapper[4775]: I0127 11:50:07.581058 4775 generic.go:334] "Generic (PLEG): container finished" podID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerID="42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d" exitCode=0
Jan 27 11:50:07 crc kubenswrapper[4775]: I0127 11:50:07.581186 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerDied","Data":"42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d"}
Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.042756 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"]
Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.055597 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"]
Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.990176 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.037863 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"]
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.043799 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"]
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") "
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178705 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") "
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") "
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.185200 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs" (OuterVolumeSpecName: "kube-api-access-2dqcs") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "kube-api-access-2dqcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.216674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.218644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory" (OuterVolumeSpecName: "inventory") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.281984 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.282174 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.282206 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerDied","Data":"a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b"} Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597296 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597311 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.689267 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:09 crc kubenswrapper[4775]: E0127 11:50:09.690125 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.690159 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.690611 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.691403 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695299 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695595 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.700637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.754135 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8726531a-a74e-48cd-a274-6f67ae507560" path="/var/lib/kubelet/pods/8726531a-a74e-48cd-a274-6f67ae507560/volumes" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.754855 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" path="/var/lib/kubelet/pods/a3942760-c6b4-43b5-9680-48d8b8ae3854/volumes" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 
11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790470 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.892806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.893087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.894271 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.899398 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.902763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.911632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:10 crc kubenswrapper[4775]: I0127 11:50:10.023700 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:10 crc kubenswrapper[4775]: I0127 11:50:10.694836 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.612160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerStarted","Data":"d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed"} Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.612677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerStarted","Data":"4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d"} Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.636598 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" podStartSLOduration=2.089282853 podStartE2EDuration="2.636572877s" podCreationTimestamp="2026-01-27 11:50:09 +0000 UTC" firstStartedPulling="2026-01-27 11:50:10.697470211 +0000 UTC m=+1789.839067988" lastFinishedPulling="2026-01-27 11:50:11.244760245 +0000 UTC m=+1790.386358012" observedRunningTime="2026-01-27 11:50:11.631111598 +0000 UTC m=+1790.772709395" watchObservedRunningTime="2026-01-27 11:50:11.636572877 +0000 UTC m=+1790.778170654" Jan 27 11:50:13 crc kubenswrapper[4775]: I0127 11:50:13.745231 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:13 crc kubenswrapper[4775]: E0127 11:50:13.745747 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:25 crc kubenswrapper[4775]: I0127 11:50:25.744857 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:25 crc kubenswrapper[4775]: E0127 11:50:25.745745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:40 crc kubenswrapper[4775]: I0127 11:50:40.745505 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:40 crc kubenswrapper[4775]: E0127 11:50:40.746698 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:41 crc kubenswrapper[4775]: I0127 11:50:41.636584 4775 scope.go:117] "RemoveContainer" containerID="7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7" Jan 27 11:50:41 crc kubenswrapper[4775]: I0127 11:50:41.688278 4775 scope.go:117] "RemoveContainer" 
containerID="b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e" Jan 27 11:50:51 crc kubenswrapper[4775]: I0127 11:50:51.745391 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:51 crc kubenswrapper[4775]: E0127 11:50:51.746232 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:52 crc kubenswrapper[4775]: I0127 11:50:52.049202 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:50:52 crc kubenswrapper[4775]: I0127 11:50:52.067969 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:50:53 crc kubenswrapper[4775]: I0127 11:50:53.756040 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" path="/var/lib/kubelet/pods/b77cbe7c-5901-44d2-959f-5435b8adbc85/volumes" Jan 27 11:51:05 crc kubenswrapper[4775]: I0127 11:51:05.745344 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:51:05 crc kubenswrapper[4775]: E0127 11:51:05.746412 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" 
podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:51:06 crc kubenswrapper[4775]: I0127 11:51:06.125304 4775 generic.go:334] "Generic (PLEG): container finished" podID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerID="d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed" exitCode=0 Jan 27 11:51:06 crc kubenswrapper[4775]: I0127 11:51:06.125371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerDied","Data":"d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed"} Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.551969 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663199 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663380 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663580 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 
11:51:07.669256 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj" (OuterVolumeSpecName: "kube-api-access-dhblj") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "kube-api-access-dhblj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.689199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory" (OuterVolumeSpecName: "inventory") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.691282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766145 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766176 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766188 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerDied","Data":"4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d"} Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143233 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143240 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.237951 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:08 crc kubenswrapper[4775]: E0127 11:51:08.238359 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.238379 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.238562 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.239183 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245116 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245181 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245256 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245402 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.254136 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.276473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.277103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.277151 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.378908 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.379028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.379079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.385705 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 
27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.385777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.394894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.594112 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.110572 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.111739 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.151265 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerStarted","Data":"31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91"} Jan 27 11:51:10 crc kubenswrapper[4775]: I0127 11:51:10.160238 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerStarted","Data":"3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6"} Jan 27 11:51:10 crc 
kubenswrapper[4775]: I0127 11:51:10.181993 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" podStartSLOduration=1.6244914879999999 podStartE2EDuration="2.181976121s" podCreationTimestamp="2026-01-27 11:51:08 +0000 UTC" firstStartedPulling="2026-01-27 11:51:09.111528168 +0000 UTC m=+1848.253125945" lastFinishedPulling="2026-01-27 11:51:09.669012801 +0000 UTC m=+1848.810610578" observedRunningTime="2026-01-27 11:51:10.173900361 +0000 UTC m=+1849.315498138" watchObservedRunningTime="2026-01-27 11:51:10.181976121 +0000 UTC m=+1849.323573898"
Jan 27 11:51:17 crc kubenswrapper[4775]: I0127 11:51:17.211372 4775 generic.go:334] "Generic (PLEG): container finished" podID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerID="3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6" exitCode=0
Jan 27 11:51:17 crc kubenswrapper[4775]: I0127 11:51:17.211441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerDied","Data":"3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6"}
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.694126 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv"
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770718 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") "
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") "
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") "
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.786705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm" (OuterVolumeSpecName: "kube-api-access-wvzsm") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "kube-api-access-wvzsm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.799771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.802647 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.872847 4775 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.873114 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.873126 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerDied","Data":"31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91"}
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240891 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240892 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.310856 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"]
Jan 27 11:51:19 crc kubenswrapper[4775]: E0127 11:51:19.311230 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.311247 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.311463 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.312145 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.314662 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.314763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.315600 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.315806 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.327155 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"]
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.487285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.488522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.499011 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.631932 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.133913 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"]
Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.250251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerStarted","Data":"6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9"}
Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.745637 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:51:20 crc kubenswrapper[4775]: E0127 11:51:20.746045 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:51:21 crc kubenswrapper[4775]: I0127 11:51:21.263786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerStarted","Data":"42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe"}
Jan 27 11:51:21 crc kubenswrapper[4775]: I0127 11:51:21.293843 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" podStartSLOduration=1.868809076 podStartE2EDuration="2.293821812s" podCreationTimestamp="2026-01-27 11:51:19 +0000 UTC" firstStartedPulling="2026-01-27 11:51:20.142681141 +0000 UTC m=+1859.284278918" lastFinishedPulling="2026-01-27 11:51:20.567693867 +0000 UTC m=+1859.709291654" observedRunningTime="2026-01-27 11:51:21.28452596 +0000 UTC m=+1860.426123737" watchObservedRunningTime="2026-01-27 11:51:21.293821812 +0000 UTC m=+1860.435419589"
Jan 27 11:51:29 crc kubenswrapper[4775]: I0127 11:51:29.360644 4775 generic.go:334] "Generic (PLEG): container finished" podID="f349798f-861c-4071-b418-61fe20227133" containerID="42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe" exitCode=0
Jan 27 11:51:29 crc kubenswrapper[4775]: I0127 11:51:29.360654 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerDied","Data":"42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe"}
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.811040 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") "
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") "
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922605 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") "
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.928891 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh" (OuterVolumeSpecName: "kube-api-access-zbqvh") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "kube-api-access-zbqvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.948036 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory" (OuterVolumeSpecName: "inventory") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.950352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024379 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024410 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024419 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerDied","Data":"6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9"}
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386650 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386271 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.478174 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"]
Jan 27 11:51:31 crc kubenswrapper[4775]: E0127 11:51:31.478750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.478775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.479015 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.479852 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.481933 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482116 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.500780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"]
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639669 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.742815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.742962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.743106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.748764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.748944 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.776732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.796785 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:32 crc kubenswrapper[4775]: I0127 11:51:32.344046 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"]
Jan 27 11:51:32 crc kubenswrapper[4775]: I0127 11:51:32.395715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerStarted","Data":"578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7"}
Jan 27 11:51:33 crc kubenswrapper[4775]: I0127 11:51:33.405213 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerStarted","Data":"9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738"}
Jan 27 11:51:33 crc kubenswrapper[4775]: I0127 11:51:33.425345 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" podStartSLOduration=2.025447413 podStartE2EDuration="2.425328304s" podCreationTimestamp="2026-01-27 11:51:31 +0000 UTC" firstStartedPulling="2026-01-27 11:51:32.346633115 +0000 UTC m=+1871.488230892" lastFinishedPulling="2026-01-27 11:51:32.746514006 +0000 UTC m=+1871.888111783" observedRunningTime="2026-01-27 11:51:33.419054292 +0000 UTC m=+1872.560652069" watchObservedRunningTime="2026-01-27 11:51:33.425328304 +0000 UTC m=+1872.566926081"
Jan 27 11:51:35 crc kubenswrapper[4775]: I0127 11:51:35.746313 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"
Jan 27 11:51:36 crc kubenswrapper[4775]: I0127 11:51:36.430930 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"}
Jan 27 11:51:41 crc kubenswrapper[4775]: I0127 11:51:41.812198 4775 scope.go:117] "RemoveContainer" containerID="fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab"
Jan 27 11:51:42 crc kubenswrapper[4775]: I0127 11:51:42.480337 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerID="9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738" exitCode=0
Jan 27 11:51:42 crc kubenswrapper[4775]: I0127 11:51:42.480378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerDied","Data":"9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738"}
Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.902612 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.980912 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") "
Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.981369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") "
Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.981476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") "
Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.986861 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw" (OuterVolumeSpecName: "kube-api-access-jf5tw") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). InnerVolumeSpecName "kube-api-access-jf5tw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.008599 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory" (OuterVolumeSpecName: "inventory") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.010271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084586 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084636 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084648 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") on node \"crc\" DevicePath \"\""
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerDied","Data":"578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7"}
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497815 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497819 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.588791 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"]
Jan 27 11:51:44 crc kubenswrapper[4775]: E0127 11:51:44.589290 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.589315 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.589593 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.590420 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.592556 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593039 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593083 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593208 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593282 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593408 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.595923 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.599751 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.601665 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"]
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696623 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696960 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697134 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697282 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"
Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697336 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798821 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798899 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: 
\"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798973 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 
11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799033 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.803873 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 
11:51:44.805482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.807297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.808927 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.808960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.809799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.809858 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.810144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.810287 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.811242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.813135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.814287 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.817955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.818663 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: 
\"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.915497 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:45 crc kubenswrapper[4775]: I0127 11:51:45.425317 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"] Jan 27 11:51:45 crc kubenswrapper[4775]: I0127 11:51:45.507112 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerStarted","Data":"b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35"} Jan 27 11:51:46 crc kubenswrapper[4775]: I0127 11:51:46.517903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerStarted","Data":"539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e"} Jan 27 11:51:46 crc kubenswrapper[4775]: I0127 11:51:46.544314 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" podStartSLOduration=2.104014596 podStartE2EDuration="2.544289561s" podCreationTimestamp="2026-01-27 11:51:44 +0000 UTC" firstStartedPulling="2026-01-27 11:51:45.429873187 +0000 UTC m=+1884.571470964" lastFinishedPulling="2026-01-27 11:51:45.870148152 +0000 UTC m=+1885.011745929" observedRunningTime="2026-01-27 11:51:46.53458011 +0000 UTC m=+1885.676177907" watchObservedRunningTime="2026-01-27 11:51:46.544289561 +0000 UTC m=+1885.685887338" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.090327 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.092565 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.110857 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209376 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") 
" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311744 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.312016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.312490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.332408 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " 
pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.418754 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.924039 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: W0127 11:52:04.927073 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03479aab_2cb4_4bf4_b59d_399b66bdff65.slice/crio-28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8 WatchSource:0}: Error finding container 28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8: Status 404 returned error can't find the container with id 28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8 Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.684702 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" exitCode=0 Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.684759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444"} Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.685114 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8"} Jan 27 11:52:06 crc kubenswrapper[4775]: I0127 11:52:06.697320 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} Jan 27 11:52:07 crc kubenswrapper[4775]: I0127 11:52:07.707134 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" exitCode=0 Jan 27 11:52:07 crc kubenswrapper[4775]: I0127 11:52:07.707175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} Jan 27 11:52:08 crc kubenswrapper[4775]: I0127 11:52:08.716569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} Jan 27 11:52:08 crc kubenswrapper[4775]: I0127 11:52:08.732490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clwvj" podStartSLOduration=2.259210566 podStartE2EDuration="4.732471654s" podCreationTimestamp="2026-01-27 11:52:04 +0000 UTC" firstStartedPulling="2026-01-27 11:52:05.688745069 +0000 UTC m=+1904.830342866" lastFinishedPulling="2026-01-27 11:52:08.162006187 +0000 UTC m=+1907.303603954" observedRunningTime="2026-01-27 11:52:08.730911503 +0000 UTC m=+1907.872509300" watchObservedRunningTime="2026-01-27 11:52:08.732471654 +0000 UTC m=+1907.874069431" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.418944 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.419417 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.461081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.817187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:16 crc kubenswrapper[4775]: I0127 11:52:16.483103 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:16 crc kubenswrapper[4775]: I0127 11:52:16.780128 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clwvj" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" containerID="cri-o://b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" gracePeriod=2 Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.254547 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267591 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.268413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities" (OuterVolumeSpecName: "utilities") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.310122 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z" (OuterVolumeSpecName: "kube-api-access-ff85z") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "kube-api-access-ff85z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.369981 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.370017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.704911 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.778831 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789299 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" exitCode=0 Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789351 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789382 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8"} Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789400 4775 scope.go:117] "RemoveContainer" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789355 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.817335 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.817550 4775 scope.go:117] "RemoveContainer" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.826507 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.843554 4775 scope.go:117] "RemoveContainer" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.882561 4775 scope.go:117] "RemoveContainer" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: E0127 11:52:17.883095 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": container with ID starting with b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529 not found: ID does not exist" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 
11:52:17.883156 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} err="failed to get container status \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": rpc error: code = NotFound desc = could not find container \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": container with ID starting with b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529 not found: ID does not exist" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883185 4775 scope.go:117] "RemoveContainer" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: E0127 11:52:17.883773 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": container with ID starting with 329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8 not found: ID does not exist" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883828 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} err="failed to get container status \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": rpc error: code = NotFound desc = could not find container \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": container with ID starting with 329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8 not found: ID does not exist" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883863 4775 scope.go:117] "RemoveContainer" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc 
kubenswrapper[4775]: E0127 11:52:17.884213 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": container with ID starting with 60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444 not found: ID does not exist" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.884245 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444"} err="failed to get container status \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": rpc error: code = NotFound desc = could not find container \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": container with ID starting with 60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444 not found: ID does not exist" Jan 27 11:52:19 crc kubenswrapper[4775]: I0127 11:52:19.766762 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" path="/var/lib/kubelet/pods/03479aab-2cb4-4bf4-b59d-399b66bdff65/volumes" Jan 27 11:52:25 crc kubenswrapper[4775]: I0127 11:52:25.866941 4775 generic.go:334] "Generic (PLEG): container finished" podID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerID="539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e" exitCode=0 Jan 27 11:52:25 crc kubenswrapper[4775]: I0127 11:52:25.867036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerDied","Data":"539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e"} Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.322294 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376884 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376924 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376949 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: 
\"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377121 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377241 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377309 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377340 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: 
\"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377442 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385202 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385813 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28" (OuterVolumeSpecName: "kube-api-access-brh28") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "kube-api-access-brh28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385847 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392884 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392912 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392942 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.393001 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.405475 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.405436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.411943 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory" (OuterVolumeSpecName: "inventory") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.418995 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479669 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479702 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479714 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479723 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479733 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479741 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479754 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479763 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479771 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479782 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479791 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479800 4775 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479808 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479818 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882799 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerDied","Data":"b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35"} Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882863 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882899 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.997788 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998215 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-utilities" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998235 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-utilities" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998251 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998262 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998276 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998282 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998295 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-content" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998303 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-content" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998552 
4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998568 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.999187 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.002223 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.002587 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003061 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003293 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.024141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: 
\"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093463 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093517 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197264 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197822 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: 
\"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.198537 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.214189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.214228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.219050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.219793 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhqsh\" (UniqueName: 
\"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.321113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.855102 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.894006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerStarted","Data":"591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac"} Jan 27 11:52:29 crc kubenswrapper[4775]: I0127 11:52:29.904660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerStarted","Data":"9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d"} Jan 27 11:52:29 crc kubenswrapper[4775]: I0127 11:52:29.925213 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" podStartSLOduration=2.453045159 podStartE2EDuration="2.925190762s" podCreationTimestamp="2026-01-27 11:52:27 +0000 UTC" firstStartedPulling="2026-01-27 11:52:28.868073706 +0000 UTC m=+1928.009671473" lastFinishedPulling="2026-01-27 11:52:29.340219289 +0000 UTC m=+1928.481817076" observedRunningTime="2026-01-27 11:52:29.922946044 +0000 UTC m=+1929.064543831" watchObservedRunningTime="2026-01-27 11:52:29.925190762 +0000 UTC m=+1929.066788539" Jan 27 11:53:35 crc kubenswrapper[4775]: I0127 
11:53:35.486637 4775 generic.go:334] "Generic (PLEG): container finished" podID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerID="9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d" exitCode=0 Jan 27 11:53:35 crc kubenswrapper[4775]: I0127 11:53:35.486744 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerDied","Data":"9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d"} Jan 27 11:53:36 crc kubenswrapper[4775]: I0127 11:53:36.946607 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146827 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.147098 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.154786 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.154822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh" (OuterVolumeSpecName: "kube-api-access-mhqsh") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "kube-api-access-mhqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.172500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.174386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory" (OuterVolumeSpecName: "inventory") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.176410 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249668 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249701 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249710 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249720 4775 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249730 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.505726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerDied","Data":"591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac"} Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.505773 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.506409 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.600747 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:37 crc kubenswrapper[4775]: E0127 11:53:37.601257 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.601283 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.601603 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.602615 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609628 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609649 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609766 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609796 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609851 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.610028 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.612303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763853 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763918 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763970 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.764000 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.764081 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866095 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866258 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.870439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.876145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.882596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.933026 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:38 crc kubenswrapper[4775]: I0127 11:53:38.435167 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:38 crc kubenswrapper[4775]: I0127 11:53:38.515305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerStarted","Data":"124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4"} Jan 27 11:53:40 crc kubenswrapper[4775]: I0127 11:53:40.536683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerStarted","Data":"0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b"} Jan 27 11:53:40 crc kubenswrapper[4775]: I0127 11:53:40.578163 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" podStartSLOduration=2.561401242 podStartE2EDuration="3.578121189s" podCreationTimestamp="2026-01-27 11:53:37 +0000 UTC" firstStartedPulling="2026-01-27 11:53:38.438497157 +0000 UTC m=+1997.580094934" lastFinishedPulling="2026-01-27 11:53:39.455217104 +0000 UTC m=+1998.596814881" observedRunningTime="2026-01-27 11:53:40.55966651 +0000 UTC m=+1999.701264337" watchObservedRunningTime="2026-01-27 11:53:40.578121189 +0000 UTC m=+1999.719718986" Jan 27 11:53:59 crc kubenswrapper[4775]: I0127 11:53:59.518102 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:53:59 crc kubenswrapper[4775]: I0127 11:53:59.518799 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:29 crc kubenswrapper[4775]: I0127 11:54:29.517577 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:54:29 crc kubenswrapper[4775]: I0127 11:54:29.518209 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:30 crc kubenswrapper[4775]: I0127 11:54:30.987444 4775 generic.go:334] "Generic (PLEG): container finished" podID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerID="0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b" exitCode=0 Jan 27 11:54:30 crc kubenswrapper[4775]: I0127 11:54:30.987519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerDied","Data":"0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b"} Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.392330 4775 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.429498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.430895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.430998 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431106 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431174 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.457551 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.459843 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj" (OuterVolumeSpecName: "kube-api-access-kvrrj") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "kube-api-access-kvrrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.460652 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.462888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.463566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.477172 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory" (OuterVolumeSpecName: "inventory") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533073 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533232 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533302 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533363 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533434 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533542 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerDied","Data":"124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4"} Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006415 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006491 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.165695 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:33 crc kubenswrapper[4775]: E0127 11:54:33.166275 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.166304 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.166573 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.167399 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.172481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.172791 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.173484 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.173676 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.174426 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.183927 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.245818 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246052 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: 
\"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246277 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.348731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349195 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.352924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: 
\"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.354908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.355149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.355189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.368415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.487285 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:34 crc kubenswrapper[4775]: I0127 11:54:34.029716 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:35 crc kubenswrapper[4775]: I0127 11:54:35.027706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerStarted","Data":"1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb"} Jan 27 11:54:36 crc kubenswrapper[4775]: I0127 11:54:36.036390 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerStarted","Data":"9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d"} Jan 27 11:54:36 crc kubenswrapper[4775]: I0127 11:54:36.061086 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" podStartSLOduration=1.960950377 podStartE2EDuration="3.061062713s" podCreationTimestamp="2026-01-27 11:54:33 +0000 UTC" firstStartedPulling="2026-01-27 11:54:34.035956102 +0000 UTC m=+2053.177553879" lastFinishedPulling="2026-01-27 11:54:35.136068438 +0000 UTC m=+2054.277666215" observedRunningTime="2026-01-27 11:54:36.053911339 +0000 UTC m=+2055.195509126" watchObservedRunningTime="2026-01-27 11:54:36.061062713 +0000 UTC m=+2055.202660490" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.517867 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 
11:54:59.518501 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.518566 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.519465 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.519535 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" gracePeriod=600 Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246276 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" exitCode=0 Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246319 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"} Jan 27 11:55:00 crc 
kubenswrapper[4775]: I0127 11:55:00.246654 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246693 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:56:06 crc kubenswrapper[4775]: I0127 11:56:06.813096 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:11 crc kubenswrapper[4775]: I0127 11:56:11.818101 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.809657 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.813334 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.813477 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.815438 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" 
containerStatusID={"Type":"cri-o","ID":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.815765 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" containerID="cri-o://30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5" gracePeriod=30 Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.065052 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0fb6dfd-0694-418a-965e-789707762ef7" containerID="30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5" exitCode=0 Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.065146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerDied","Data":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.172128 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:56:24 crc kubenswrapper[4775]: I0127 11:56:24.111592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"a3a1333d65d5d593f15afb4b0f08e508a777bb7a8596b72b5e166d8e425466e1"} Jan 27 11:56:59 crc kubenswrapper[4775]: I0127 11:56:59.517597 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:56:59 crc kubenswrapper[4775]: 
I0127 11:56:59.518423 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.495987 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.500980 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.512469 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602308 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602364 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod 
\"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.704756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.704807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.704856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.705430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.705493 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: 
\"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.725374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.878200 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:04 crc kubenswrapper[4775]: I0127 11:57:04.390316 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:04 crc kubenswrapper[4775]: I0127 11:57:04.539956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerStarted","Data":"0291753de9c98da2bae990b058c91775c37c615ad32e74c62d7d9edcc6e28728"} Jan 27 11:57:16 crc kubenswrapper[4775]: I0127 11:57:16.813040 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 27 11:57:29 crc kubenswrapper[4775]: I0127 11:57:29.517655 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:57:29 crc kubenswrapper[4775]: I0127 11:57:29.518438 4775 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.189537 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.191946 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.206173 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.328725 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.329106 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.329215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: 
\"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431877 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.432732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.432786 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") 
" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.462363 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.516408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:32 crc kubenswrapper[4775]: I0127 11:57:32.068208 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:32 crc kubenswrapper[4775]: I0127 11:57:32.894627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerStarted","Data":"0aa60fb56c0c1bbe7a3921142085582cfc65147743088dfa80a0c2dcd32c2888"} Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.569986 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.573290 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.586535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.621947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.622059 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.622090 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.724061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.724118 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.724272 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.725076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.725100 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.747077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.893903 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:36 crc kubenswrapper[4775]: I0127 11:57:36.205831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:36 crc kubenswrapper[4775]: I0127 11:57:36.932160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"d1e46eda680fc26f0fdf41433ba80f7398beded8cdfb1da42d96181b8897822d"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.970360 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64" exitCode=0 Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.970669 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.973845 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043" exitCode=0 Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.973939 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.976870 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" exitCode=0 Jan 27 11:57:39 crc 
kubenswrapper[4775]: I0127 11:57:39.976899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76"} Jan 27 11:57:41 crc kubenswrapper[4775]: I0127 11:57:41.630050 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output=< Jan 27 11:57:41 crc kubenswrapper[4775]: Unkown error: Expecting value: line 1 column 1 (char 0) Jan 27 11:57:41 crc kubenswrapper[4775]: > Jan 27 11:57:41 crc kubenswrapper[4775]: I0127 11:57:41.996782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.010299 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33" exitCode=0 Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.010467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33"} Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.017522 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" exitCode=0 Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.017613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232"} Jan 27 11:57:43 crc kubenswrapper[4775]: I0127 11:57:43.029491 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003" exitCode=0 Jan 27 11:57:43 crc kubenswrapper[4775]: I0127 11:57:43.029541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} Jan 27 11:57:49 crc kubenswrapper[4775]: I0127 11:57:49.812792 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podUID="fae72616-e516-4ce6-86b8-b28f14a92939" containerName="sbdb" probeResult="failure" output="command timed out" Jan 27 11:57:49 crc kubenswrapper[4775]: I0127 11:57:49.813926 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podUID="fae72616-e516-4ce6-86b8-b28f14a92939" containerName="nbdb" probeResult="failure" output="command timed out" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.517481 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.518118 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.518169 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.519055 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.519115 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" gracePeriod=600 Jan 27 11:58:00 crc kubenswrapper[4775]: E0127 11:58:00.155309 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.180799 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" exitCode=0 Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.180955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.181148 4775 scope.go:117] "RemoveContainer" containerID="a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.182016 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:00 crc kubenswrapper[4775]: E0127 11:58:00.182402 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.193679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerStarted","Data":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.196085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.197996 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" 
event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerStarted","Data":"267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.219293 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rw88w" podStartSLOduration=9.953232602 podStartE2EDuration="30.219272397s" podCreationTimestamp="2026-01-27 11:57:31 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.977914957 +0000 UTC m=+2239.119512734" lastFinishedPulling="2026-01-27 11:58:00.243954752 +0000 UTC m=+2259.385552529" observedRunningTime="2026-01-27 11:58:01.216895592 +0000 UTC m=+2260.358493389" watchObservedRunningTime="2026-01-27 11:58:01.219272397 +0000 UTC m=+2260.360870174" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.238034 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n28vv" podStartSLOduration=6.0854180509999996 podStartE2EDuration="26.238015313s" podCreationTimestamp="2026-01-27 11:57:35 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.972373648 +0000 UTC m=+2239.113971425" lastFinishedPulling="2026-01-27 11:58:00.12497091 +0000 UTC m=+2259.266568687" observedRunningTime="2026-01-27 11:58:01.234850447 +0000 UTC m=+2260.376448224" watchObservedRunningTime="2026-01-27 11:58:01.238015313 +0000 UTC m=+2260.379613090" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.254292 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8xp2v" podStartSLOduration=38.008557906 podStartE2EDuration="58.254270932s" podCreationTimestamp="2026-01-27 11:57:03 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.976397057 +0000 UTC m=+2239.117994834" lastFinishedPulling="2026-01-27 11:58:00.222110083 +0000 UTC m=+2259.363707860" observedRunningTime="2026-01-27 11:58:01.254250651 +0000 UTC m=+2260.395848438" 
watchObservedRunningTime="2026-01-27 11:58:01.254270932 +0000 UTC m=+2260.395868709" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.516937 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.516988 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:02 crc kubenswrapper[4775]: I0127 11:58:02.559835 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rw88w" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:02 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:02 crc kubenswrapper[4775]: > Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.879487 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.879572 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.932329 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:05 crc kubenswrapper[4775]: I0127 11:58:05.894231 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:05 crc kubenswrapper[4775]: I0127 11:58:05.895832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:06 crc kubenswrapper[4775]: I0127 11:58:06.937029 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n28vv" 
podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:06 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:06 crc kubenswrapper[4775]: > Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.563661 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.613195 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.647845 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output=< Jan 27 11:58:11 crc kubenswrapper[4775]: Unkown error: Expecting value: line 1 column 1 (char 0) Jan 27 11:58:11 crc kubenswrapper[4775]: > Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.647931 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.649549 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-notification-agent" containerStatusID={"Type":"cri-o","ID":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-notification-agent failed liveness probe, will be restarted" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.649624 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" containerID="cri-o://aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13" gracePeriod=30 Jan 27 11:58:11 crc 
kubenswrapper[4775]: I0127 11:58:11.806934 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.313049 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0fb6dfd-0694-418a-965e-789707762ef7" containerID="aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13" exitCode=0 Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.313135 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerDied","Data":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.313436 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rw88w" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" containerID="cri-o://6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" gracePeriod=2 Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.795930 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.937103 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979322 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979995 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities" (OuterVolumeSpecName: "utilities") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.981030 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.989819 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6" (OuterVolumeSpecName: "kube-api-access-4bjj6") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "kube-api-access-4bjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.027279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.083129 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.083360 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330427 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" exitCode=0 Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330480 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330979 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"0aa60fb56c0c1bbe7a3921142085582cfc65147743088dfa80a0c2dcd32c2888"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.331008 4775 scope.go:117] "RemoveContainer" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.341987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"4aa92910d4fd30d50d0a609af387150a1c8121886282da00160ed4a2d0b4ef35"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.357066 4775 scope.go:117] "RemoveContainer" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.410212 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.411051 4775 scope.go:117] "RemoveContainer" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.420339 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.438680 4775 scope.go:117] "RemoveContainer" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: E0127 11:58:14.439393 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": container with ID starting with 6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9 not found: ID does not exist" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.439441 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} err="failed to get container status \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": rpc error: code = NotFound desc = could not find container \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": 
container with ID starting with 6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9 not found: ID does not exist" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.439504 4775 scope.go:117] "RemoveContainer" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: E0127 11:58:14.440085 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": container with ID starting with cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232 not found: ID does not exist" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440153 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232"} err="failed to get container status \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": rpc error: code = NotFound desc = could not find container \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": container with ID starting with cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232 not found: ID does not exist" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440202 4775 scope.go:117] "RemoveContainer" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: E0127 11:58:14.440723 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": container with ID starting with a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76 not found: ID does not exist" 
containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440750 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76"} err="failed to get container status \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": rpc error: code = NotFound desc = could not find container \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": container with ID starting with a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76 not found: ID does not exist" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.745528 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:15 crc kubenswrapper[4775]: E0127 11:58:15.745785 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.756526 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" path="/var/lib/kubelet/pods/0c6c0e22-40c5-460b-bd26-97757534ba57/volumes" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.812779 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.813055 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8xp2v" 
podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server" containerID="cri-o://267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308" gracePeriod=2 Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.362872 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308" exitCode=0 Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.362908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308"} Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.904287 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.950526 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n28vv" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:16 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:16 crc kubenswrapper[4775]: > Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040756 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") 
pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.041749 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities" (OuterVolumeSpecName: "utilities") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.047089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp" (OuterVolumeSpecName: "kube-api-access-zgtkp") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "kube-api-access-zgtkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.067136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143117 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143158 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143171 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.378779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"0291753de9c98da2bae990b058c91775c37c615ad32e74c62d7d9edcc6e28728"}
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.378864 4775 scope.go:117] "RemoveContainer" containerID="267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308"
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.379068 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v"
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.419935 4775 scope.go:117] "RemoveContainer" containerID="3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33"
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.422443 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"]
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.431231 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"]
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.440317 4775 scope.go:117] "RemoveContainer" containerID="63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043"
Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.755282 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" path="/var/lib/kubelet/pods/1f75ce6e-5d6d-4f5c-8ac9-803632a916da/volumes"
Jan 27 11:58:25 crc kubenswrapper[4775]: I0127 11:58:25.953497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n28vv"
Jan 27 11:58:26 crc kubenswrapper[4775]: I0127 11:58:26.024954 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n28vv"
Jan 27 11:58:26 crc kubenswrapper[4775]: I0127 11:58:26.085425 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"]
Jan 27 11:58:27 crc kubenswrapper[4775]: I0127 11:58:27.471516 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n28vv" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" containerID="cri-o://d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" gracePeriod=2
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.451651 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495193 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" exitCode=0
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"}
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"d1e46eda680fc26f0fdf41433ba80f7398beded8cdfb1da42d96181b8897822d"}
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495325 4775 scope.go:117] "RemoveContainer" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495494 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.518935 4775 scope.go:117] "RemoveContainer" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.551695 4775 scope.go:117] "RemoveContainer" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.566621 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") "
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.567328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") "
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.567540 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") "
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.570134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities" (OuterVolumeSpecName: "utilities") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.580837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt" (OuterVolumeSpecName: "kube-api-access-pm2qt") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "kube-api-access-pm2qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.624824 4775 scope.go:117] "RemoveContainer" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"
Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.628661 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": container with ID starting with d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2 not found: ID does not exist" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.628747 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"} err="failed to get container status \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": rpc error: code = NotFound desc = could not find container \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": container with ID starting with d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2 not found: ID does not exist"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.628794 4775 scope.go:117] "RemoveContainer" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"
Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.629292 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": container with ID starting with 101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003 not found: ID does not exist" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629326 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} err="failed to get container status \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": rpc error: code = NotFound desc = could not find container \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": container with ID starting with 101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003 not found: ID does not exist"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629347 4775 scope.go:117] "RemoveContainer" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"
Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.629753 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": container with ID starting with 18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64 not found: ID does not exist" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629805 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"} err="failed to get container status \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": rpc error: code = NotFound desc = could not find container \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": container with ID starting with 18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64 not found: ID does not exist"
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.671230 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.671277 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.712951 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.773071 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.841856 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"]
Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.852060 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"]
Jan 27 11:58:29 crc kubenswrapper[4775]: I0127 11:58:29.746135 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:58:29 crc kubenswrapper[4775]: E0127 11:58:29.746508 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:58:29 crc kubenswrapper[4775]: I0127 11:58:29.756533 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" path="/var/lib/kubelet/pods/3fbf29b1-8a03-401a-99b3-7e5e6334036b/volumes"
Jan 27 11:58:40 crc kubenswrapper[4775]: I0127 11:58:40.744976 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:58:40 crc kubenswrapper[4775]: E0127 11:58:40.746410 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:58:51 crc kubenswrapper[4775]: I0127 11:58:51.750938 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:58:51 crc kubenswrapper[4775]: E0127 11:58:51.751817 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:59:02 crc kubenswrapper[4775]: I0127 11:59:02.745011 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:59:02 crc kubenswrapper[4775]: E0127 11:59:02.745974 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:59:14 crc kubenswrapper[4775]: I0127 11:59:14.744827 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:59:14 crc kubenswrapper[4775]: E0127 11:59:14.745728 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:59:28 crc kubenswrapper[4775]: I0127 11:59:28.744781 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:59:28 crc kubenswrapper[4775]: E0127 11:59:28.745570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:59:41 crc kubenswrapper[4775]: I0127 11:59:41.760826 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:59:41 crc kubenswrapper[4775]: E0127 11:59:41.761791 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 11:59:53 crc kubenswrapper[4775]: I0127 11:59:53.747660 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 11:59:53 crc kubenswrapper[4775]: E0127 11:59:53.748430 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.156247 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"]
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157428 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157443 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157479 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157488 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157513 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157521 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157535 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157542 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157556 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157563 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-utilities"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157600 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157608 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157629 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157637 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157658 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157666 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-content"
Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157695 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157928 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157950 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.158829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.162024 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.165205 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.183331 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"]
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341412 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443534 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.445418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.451016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.459770 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.489801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.918452 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"]
Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.315562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerStarted","Data":"f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5"}
Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.316648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerStarted","Data":"a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88"}
Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.335676 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" podStartSLOduration=1.335660659 podStartE2EDuration="1.335660659s" podCreationTimestamp="2026-01-27 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 12:00:01.334054504 +0000 UTC m=+2380.475652281" watchObservedRunningTime="2026-01-27 12:00:01.335660659 +0000 UTC m=+2380.477258436"
Jan 27 12:00:02 crc kubenswrapper[4775]: I0127 12:00:02.325502 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerID="f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5" exitCode=0
Jan 27 12:00:02 crc kubenswrapper[4775]: I0127 12:00:02.325606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerDied","Data":"f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5"}
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.608072 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.699750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") "
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.700194 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") "
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.701063 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") "
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.701598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.702337 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") on node \"crc\" DevicePath \"\""
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.706553 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt" (OuterVolumeSpecName: "kube-api-access-bgnnt") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "kube-api-access-bgnnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.706653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.804310 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.804350 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") on node \"crc\" DevicePath \"\""
Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.348900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerDied","Data":"a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88"}
Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.348968 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"
Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.349501 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88"
Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.416076 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"]
Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.425618 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"]
Jan 27 12:00:05 crc kubenswrapper[4775]: I0127 12:00:05.745882 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 12:00:05 crc kubenswrapper[4775]: E0127 12:00:05.746415 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:00:05 crc kubenswrapper[4775]: I0127 12:00:05.764071 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" path="/var/lib/kubelet/pods/04906ea0-5e8b-4e8b-8f20-c46587da8346/volumes"
Jan 27 12:00:16 crc kubenswrapper[4775]: I0127 12:00:16.745139 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 12:00:16 crc kubenswrapper[4775]: E0127 12:00:16.745920 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:00:27 crc kubenswrapper[4775]: I0127 12:00:27.744807 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 12:00:27 crc kubenswrapper[4775]: E0127 12:00:27.745651 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:00:40 crc kubenswrapper[4775]: I0127 12:00:40.745966 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 12:00:40 crc kubenswrapper[4775]: E0127 12:00:40.747081 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:00:41 crc kubenswrapper[4775]: I0127 12:00:41.669475 4775 generic.go:334] "Generic (PLEG): container finished" podID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerID="9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d" exitCode=0
Jan 27 12:00:41 crc kubenswrapper[4775]: I0127 12:00:41.669584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerDied","Data":"9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d"}
Jan 27 12:00:42 crc kubenswrapper[4775]: I0127 12:00:42.166951 4775 scope.go:117] "RemoveContainer" containerID="bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817"
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.150122 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") "
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341904 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") "
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") "
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.342203 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") "
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.342235 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") "
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.354735 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.354837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv" (OuterVolumeSpecName: "kube-api-access-jthbv") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "kube-api-access-jthbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.379879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory" (OuterVolumeSpecName: "inventory") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.379962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.380314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444392 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444438 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444465 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444474 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthbv\" (UniqueName: 
\"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444484 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691544 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerDied","Data":"1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb"} Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691596 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691620 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.808924 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:43 crc kubenswrapper[4775]: E0127 12:00:43.809516 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809540 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: E0127 12:00:43.809564 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809573 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809823 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809846 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.810628 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813437 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813460 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813665 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814283 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814688 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.816727 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.822004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: 
I0127 12:00:43.954291 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954437 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954511 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954574 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.055960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: 
\"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056033 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056177 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057301 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057482 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.062371 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.063138 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.063756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.064074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.064246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.065197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.066050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.072608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.139070 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.763298 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:45 crc kubenswrapper[4775]: I0127 12:00:45.715084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerStarted","Data":"c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1"} Jan 27 12:00:46 crc kubenswrapper[4775]: I0127 12:00:46.737917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerStarted","Data":"0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca"} Jan 27 12:00:46 crc kubenswrapper[4775]: I0127 12:00:46.798259 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" podStartSLOduration=2.823315264 podStartE2EDuration="3.798222252s" podCreationTimestamp="2026-01-27 12:00:43 +0000 UTC" firstStartedPulling="2026-01-27 12:00:44.766127084 +0000 UTC m=+2423.907724861" lastFinishedPulling="2026-01-27 12:00:45.741034072 +0000 UTC m=+2424.882631849" observedRunningTime="2026-01-27 12:00:46.779326306 +0000 UTC m=+2425.920924103" watchObservedRunningTime="2026-01-27 12:00:46.798222252 +0000 UTC m=+2425.939820029" Jan 27 12:00:55 crc kubenswrapper[4775]: I0127 12:00:55.745586 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:55 crc kubenswrapper[4775]: E0127 12:00:55.746820 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.142463 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.143770 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.161246 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217289 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217387 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217472 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 
27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319754 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 
12:01:00.326696 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.336538 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.337212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.339389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.474500 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.902303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: W0127 12:01:00.908994 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ce874bb_50b0_4a56_a322_f5590c1d19bd.slice/crio-865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98 WatchSource:0}: Error finding container 865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98: Status 404 returned error can't find the container with id 865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98 Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.902803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerStarted","Data":"9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985"} Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.903464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerStarted","Data":"865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98"} Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.923471 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29491921-2bnsm" podStartSLOduration=1.9234339280000001 podStartE2EDuration="1.923433928s" podCreationTimestamp="2026-01-27 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 12:01:01.921799695 +0000 UTC m=+2441.063397502" watchObservedRunningTime="2026-01-27 12:01:01.923433928 +0000 UTC m=+2441.065031705" Jan 27 12:01:03 
crc kubenswrapper[4775]: I0127 12:01:03.925445 4775 generic.go:334] "Generic (PLEG): container finished" podID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerID="9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985" exitCode=0 Jan 27 12:01:03 crc kubenswrapper[4775]: I0127 12:01:03.925516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerDied","Data":"9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985"} Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.263007 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316195 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316247 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316341 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.322849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.333443 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m" (OuterVolumeSpecName: "kube-api-access-ppd4m") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "kube-api-access-ppd4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.347234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.383848 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data" (OuterVolumeSpecName: "config-data") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418223 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418256 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418269 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418277 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerDied","Data":"865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98"} Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943847 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943867 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:06 crc kubenswrapper[4775]: I0127 12:01:06.745434 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:06 crc kubenswrapper[4775]: E0127 12:01:06.745738 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:19 crc kubenswrapper[4775]: I0127 12:01:19.745821 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:19 crc kubenswrapper[4775]: E0127 12:01:19.746832 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:33 crc kubenswrapper[4775]: I0127 12:01:33.745816 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:33 crc kubenswrapper[4775]: E0127 12:01:33.746969 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:46 crc kubenswrapper[4775]: I0127 12:01:46.744598 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:46 crc kubenswrapper[4775]: E0127 12:01:46.746387 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:00 crc kubenswrapper[4775]: I0127 12:02:00.744951 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:00 crc kubenswrapper[4775]: E0127 12:02:00.745852 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:11 crc kubenswrapper[4775]: I0127 12:02:11.752330 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:11 crc kubenswrapper[4775]: E0127 12:02:11.753369 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:25 crc kubenswrapper[4775]: I0127 12:02:25.745519 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:25 crc kubenswrapper[4775]: E0127 12:02:25.746780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:38 crc kubenswrapper[4775]: I0127 12:02:38.745710 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:38 crc kubenswrapper[4775]: E0127 12:02:38.746419 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:50 crc kubenswrapper[4775]: I0127 12:02:50.745421 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:50 crc kubenswrapper[4775]: E0127 12:02:50.746296 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:03:01 crc kubenswrapper[4775]: I0127 12:03:01.754535 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:03:02 crc kubenswrapper[4775]: I0127 12:03:02.938973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"} Jan 27 12:03:04 crc kubenswrapper[4775]: I0127 12:03:04.962599 4775 generic.go:334] "Generic (PLEG): container finished" podID="36bee79d-4a97-407b-9907-87d740929ba0" containerID="0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca" exitCode=0 Jan 27 12:03:04 crc kubenswrapper[4775]: I0127 12:03:04.962765 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerDied","Data":"0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca"} Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.371798 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.498988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499151 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499250 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499348 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499551 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.507384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c" (OuterVolumeSpecName: "kube-api-access-cz55c") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "kube-api-access-cz55c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.507574 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.533706 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.533733 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.534009 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.537400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory" (OuterVolumeSpecName: "inventory") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.539923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.544739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.552586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603583 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603629 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603651 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603663 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603707 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603720 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603735 4775 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603751 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603762 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.981127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerDied","Data":"c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1"} Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.981168 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.981239 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.096912 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"] Jan 27 12:03:07 crc kubenswrapper[4775]: E0127 12:03:07.097335 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097361 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: E0127 12:03:07.097404 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097414 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097768 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097806 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.098538 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102069 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102343 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102655 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102886 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.103100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.108875 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"] Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133385 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: 
\"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235459 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc 
kubenswrapper[4775]: I0127 12:03:07.235492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235517 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235546 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235575 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.240775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.240985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.241000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.241941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.242370 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.242790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.254047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.438389 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:03:08 crc kubenswrapper[4775]: I0127 12:03:08.026038 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"]
Jan 27 12:03:08 crc kubenswrapper[4775]: I0127 12:03:08.033417 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.002275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerStarted","Data":"8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde"}
Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.002762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerStarted","Data":"e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6"}
Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.025980 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" podStartSLOduration=1.549605085 podStartE2EDuration="2.025961373s" podCreationTimestamp="2026-01-27 12:03:07 +0000 UTC" firstStartedPulling="2026-01-27 12:03:08.033209927 +0000 UTC m=+2567.174807704" lastFinishedPulling="2026-01-27 12:03:08.509566195 +0000 UTC m=+2567.651163992" observedRunningTime="2026-01-27 12:03:09.020525797 +0000 UTC m=+2568.162123584" watchObservedRunningTime="2026-01-27 12:03:09.025961373 +0000 UTC m=+2568.167559150"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.676216 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.695041 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.707894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840353 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840389 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.943154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.943364 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.969223 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:24 crc kubenswrapper[4775]: I0127 12:03:24.025289 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:25 crc kubenswrapper[4775]: W0127 12:03:25.643761 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452087fc_eb7e_4fb0_9c5b_cc827c8fc32e.slice/crio-d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f WatchSource:0}: Error finding container d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f: Status 404 returned error can't find the container with id d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f
Jan 27 12:03:25 crc kubenswrapper[4775]: I0127 12:03:25.645688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159324 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032" exitCode=0
Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159387 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"}
Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159730 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f"}
Jan 27 12:03:27 crc kubenswrapper[4775]: I0127 12:03:27.187727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"}
Jan 27 12:03:28 crc kubenswrapper[4775]: I0127 12:03:28.200126 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9" exitCode=0
Jan 27 12:03:28 crc kubenswrapper[4775]: I0127 12:03:28.200179 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"}
Jan 27 12:03:29 crc kubenswrapper[4775]: I0127 12:03:29.213611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"}
Jan 27 12:03:29 crc kubenswrapper[4775]: I0127 12:03:29.245757 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzgfp" podStartSLOduration=3.825975781 podStartE2EDuration="6.245737595s" podCreationTimestamp="2026-01-27 12:03:23 +0000 UTC" firstStartedPulling="2026-01-27 12:03:26.160988929 +0000 UTC m=+2585.302586706" lastFinishedPulling="2026-01-27 12:03:28.580750743 +0000 UTC m=+2587.722348520" observedRunningTime="2026-01-27 12:03:29.235829519 +0000 UTC m=+2588.377427296" watchObservedRunningTime="2026-01-27 12:03:29.245737595 +0000 UTC m=+2588.387335372"
Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.025733 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.026247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.069053 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.304794 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.355313 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.264493 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzgfp" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" containerID="cri-o://dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" gracePeriod=2
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.725314 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.893936 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") "
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894254 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") "
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") "
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894957 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities" (OuterVolumeSpecName: "utilities") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.896389 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.900546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc" (OuterVolumeSpecName: "kube-api-access-wmdcc") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "kube-api-access-wmdcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.954131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.998404 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") on node \"crc\" DevicePath \"\""
Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.998622 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.280952 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" exitCode=0
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281061 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"}
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f"}
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281179 4775 scope.go:117] "RemoveContainer" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281084 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.306726 4775 scope.go:117] "RemoveContainer" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.332226 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.343320 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"]
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.356476 4775 scope.go:117] "RemoveContainer" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.396743 4775 scope.go:117] "RemoveContainer" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"
Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.397354 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": container with ID starting with dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd not found: ID does not exist" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397401 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"} err="failed to get container status \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": rpc error: code = NotFound desc = could not find container \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": container with ID starting with dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd not found: ID does not exist"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397431 4775 scope.go:117] "RemoveContainer" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"
Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.397793 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": container with ID starting with 32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9 not found: ID does not exist" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397837 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"} err="failed to get container status \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": rpc error: code = NotFound desc = could not find container \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": container with ID starting with 32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9 not found: ID does not exist"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397864 4775 scope.go:117] "RemoveContainer" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"
Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.398241 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": container with ID starting with 38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032 not found: ID does not exist" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.398308 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"} err="failed to get container status \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": rpc error: code = NotFound desc = could not find container \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": container with ID starting with 38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032 not found: ID does not exist"
Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.755527 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" path="/var/lib/kubelet/pods/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e/volumes"
Jan 27 12:05:29 crc kubenswrapper[4775]: I0127 12:05:29.518053 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:05:29 crc kubenswrapper[4775]: I0127 12:05:29.518725 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:05:47 crc kubenswrapper[4775]: I0127 12:05:47.870921 4775 generic.go:334] "Generic (PLEG): container finished" podID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerID="8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde" exitCode=0
Jan 27 12:05:47 crc kubenswrapper[4775]: I0127 12:05:47.871009 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerDied","Data":"8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde"}
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.264686 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.385467 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.385590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386034 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386099 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386207 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") "
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.395724 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs" (OuterVolumeSpecName: "kube-api-access-d8qrs") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "kube-api-access-d8qrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.397092 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.420311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.422278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.423301 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.424622 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.431652 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory" (OuterVolumeSpecName: "inventory") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489392 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489430 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489440 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489525 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489535 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489543 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489575 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") on node \"crc\" DevicePath \"\""
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.891965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerDied","Data":"e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6"}
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.892031 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"
Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.892046 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6"
Jan 27 12:05:49 crc kubenswrapper[4775]: E0127 12:05:49.924979 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5bab8d8_2ee4_4499_aa5a_9fe4f21ad398.slice/crio-e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5bab8d8_2ee4_4499_aa5a_9fe4f21ad398.slice\": RecentStats: unable to find data in memory cache]"
Jan 27 12:05:59 crc kubenswrapper[4775]: I0127 12:05:59.518126 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:05:59 crc kubenswrapper[4775]: I0127 12:05:59.518769 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.517526 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.518117 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.518178 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.519062 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.519129 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365" gracePeriod=600
Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.292652 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365" exitCode=0
Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.292738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"}
Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.293375 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
Jan 27 12:06:36 crc kubenswrapper[4775]: I0127 12:06:36.814104 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Jan 27 12:06:37 crc kubenswrapper[4775]: I0127 12:06:37.390164 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"}
Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.804574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Jan 27 12:07:01 crc kubenswrapper[4775]: E0127
12:07:01.805525 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-utilities" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805540 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-utilities" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805565 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805571 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805584 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-content" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805590 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-content" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805605 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805612 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805824 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805846 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.806626 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.812300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.813222 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.815063 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m62zz" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.815252 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.819061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.873282 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.874071 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.874296 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976101 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976162 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976272 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976303 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976460 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976566 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.977213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.977540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.983175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078140 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdp7\" (UniqueName: 
\"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078408 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078938 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: 
\"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.080279 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.083066 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.084112 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.096300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.107312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 
12:07:02.137972 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.676599 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 27 12:07:03 crc kubenswrapper[4775]: I0127 12:07:03.612197 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerStarted","Data":"00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da"} Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.169715 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.170535 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/open
stack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spdp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod tempest-tests-tempest_openstack(ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.171788 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.934549 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" Jan 27 12:07:48 crc kubenswrapper[4775]: I0127 12:07:48.550875 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 27 12:07:50 crc kubenswrapper[4775]: I0127 12:07:50.072280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerStarted","Data":"40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748"} Jan 27 12:07:50 crc kubenswrapper[4775]: I0127 12:07:50.095090 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.226045863 podStartE2EDuration="50.095064914s" podCreationTimestamp="2026-01-27 12:07:00 +0000 UTC" firstStartedPulling="2026-01-27 12:07:02.679610082 +0000 UTC m=+2801.821207859" lastFinishedPulling="2026-01-27 12:07:48.548629133 +0000 UTC m=+2847.690226910" observedRunningTime="2026-01-27 12:07:50.093584645 +0000 UTC m=+2849.235182432" 
watchObservedRunningTime="2026-01-27 12:07:50.095064914 +0000 UTC m=+2849.236662691" Jan 27 12:07:58 crc kubenswrapper[4775]: I0127 12:07:58.144722 4775 generic.go:334] "Generic (PLEG): container finished" podID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerID="40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748" exitCode=123 Jan 27 12:07:58 crc kubenswrapper[4775]: I0127 12:07:58.144842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerDied","Data":"40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748"} Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.429604 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-66648b46df-hskmp" podUID="e22ddb6f-e33b-41ea-a24f-c97c0676e6e5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.528749 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668314 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668828 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" 
(UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668854 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.669742 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.670349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data" (OuterVolumeSpecName: "config-data") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.670477 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.676096 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.676809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7" (OuterVolumeSpecName: "kube-api-access-spdp7") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "kube-api-access-spdp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.699818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.700087 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.701324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.724783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771500 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771536 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771551 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771565 4775 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771575 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771587 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771598 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771609 
4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771621 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.796007 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.873864 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171220 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerDied","Data":"00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da"} Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171651 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da" Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171273 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.458077 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:08 crc kubenswrapper[4775]: E0127 12:08:08.459205 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.459227 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.459519 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.460250 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.463250 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m62zz" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.469255 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.554631 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.554761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.657084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.657860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.658366 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.682662 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.686992 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.792194 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:09 crc kubenswrapper[4775]: I0127 12:08:09.266946 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:09 crc kubenswrapper[4775]: W0127 12:08:09.268926 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d17d9d1_39f7_417c_b058_cda582c7f7d3.slice/crio-c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8 WatchSource:0}: Error finding container c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8: Status 404 returned error can't find the container with id c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8 Jan 27 12:08:09 crc kubenswrapper[4775]: I0127 12:08:09.272013 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 12:08:10 crc kubenswrapper[4775]: I0127 12:08:10.260846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9d17d9d1-39f7-417c-b058-cda582c7f7d3","Type":"ContainerStarted","Data":"c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8"} Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.271590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9d17d9d1-39f7-417c-b058-cda582c7f7d3","Type":"ContainerStarted","Data":"ea19d02276e7877ad22f2cd609aa15158b05d96b97422667d1ded2ab38bb1384"} Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.288437 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.16042419 podStartE2EDuration="3.288412766s" podCreationTimestamp="2026-01-27 12:08:08 +0000 UTC" 
firstStartedPulling="2026-01-27 12:08:09.271731428 +0000 UTC m=+2868.413329205" lastFinishedPulling="2026-01-27 12:08:10.399720004 +0000 UTC m=+2869.541317781" observedRunningTime="2026-01-27 12:08:11.285900559 +0000 UTC m=+2870.427498336" watchObservedRunningTime="2026-01-27 12:08:11.288412766 +0000 UTC m=+2870.430010543" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.641782 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.644122 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.658288 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod 
\"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827325 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827476 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827599 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.828157 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.828266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"certified-operators-g96dh\" (UID: 
\"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.857614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.962936 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:12 crc kubenswrapper[4775]: I0127 12:08:12.515613 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292409 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b" exitCode=0 Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292499 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b"} Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292848 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"49113f27bda5f8b5de5626a113acf85f302b42d3df20c2e33e6424f2d1c022cf"} Jan 27 12:08:15 crc kubenswrapper[4775]: I0127 12:08:15.313315 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" 
event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852"} Jan 27 12:08:16 crc kubenswrapper[4775]: I0127 12:08:16.328573 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852" exitCode=0 Jan 27 12:08:16 crc kubenswrapper[4775]: I0127 12:08:16.328633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852"} Jan 27 12:08:18 crc kubenswrapper[4775]: I0127 12:08:18.351918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320"} Jan 27 12:08:18 crc kubenswrapper[4775]: I0127 12:08:18.380262 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g96dh" podStartSLOduration=3.278407124 podStartE2EDuration="7.380232258s" podCreationTimestamp="2026-01-27 12:08:11 +0000 UTC" firstStartedPulling="2026-01-27 12:08:13.29780845 +0000 UTC m=+2872.439406227" lastFinishedPulling="2026-01-27 12:08:17.399633584 +0000 UTC m=+2876.541231361" observedRunningTime="2026-01-27 12:08:18.373411766 +0000 UTC m=+2877.515009563" watchObservedRunningTime="2026-01-27 12:08:18.380232258 +0000 UTC m=+2877.521830035" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.246422 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.251656 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.266413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.395950 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.396130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.396174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498589 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498826 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.499575 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.499667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.524033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.585860 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:20 crc kubenswrapper[4775]: I0127 12:08:20.159485 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:20 crc kubenswrapper[4775]: W0127 12:08:20.166762 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be7b52b_2651_4ee2_ab40_fef637a295e9.slice/crio-368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b WatchSource:0}: Error finding container 368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b: Status 404 returned error can't find the container with id 368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b Jan 27 12:08:20 crc kubenswrapper[4775]: I0127 12:08:20.380868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b"} Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.035804 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.040194 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.081158 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.142362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.143537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.143805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.245600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.246352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.247014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.248207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.248488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.285481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.389204 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.398634 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" exitCode=0 Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.398710 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776"} Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.944101 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: W0127 12:08:21.951051 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod435eff0a_268d_44de_921d_217e8067a11d.slice/crio-5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9 WatchSource:0}: Error finding container 5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9: Status 404 returned error can't find the container with id 5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9 Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.964020 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.964087 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.030218 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.428736 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442341 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9" exitCode=0 Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9"} Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9"} Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.521415 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.456117 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7"} Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.460737 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" exitCode=0 Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.460818 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} Jan 27 12:08:24 crc kubenswrapper[4775]: I0127 12:08:24.473474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} Jan 27 12:08:24 crc kubenswrapper[4775]: I0127 12:08:24.498862 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vzf4" podStartSLOduration=2.84972763 podStartE2EDuration="5.498841653s" podCreationTimestamp="2026-01-27 12:08:19 +0000 UTC" firstStartedPulling="2026-01-27 12:08:21.413882951 +0000 UTC m=+2880.555480728" lastFinishedPulling="2026-01-27 12:08:24.062996974 +0000 UTC m=+2883.204594751" observedRunningTime="2026-01-27 12:08:24.494584749 +0000 UTC m=+2883.636182556" watchObservedRunningTime="2026-01-27 12:08:24.498841653 +0000 UTC m=+2883.640439430" Jan 27 12:08:25 crc kubenswrapper[4775]: I0127 12:08:25.606392 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:25 crc kubenswrapper[4775]: I0127 12:08:25.606728 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g96dh" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" containerID="cri-o://26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" gracePeriod=2 Jan 27 12:08:26 crc kubenswrapper[4775]: I0127 12:08:26.494916 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" 
exitCode=0 Jan 27 12:08:26 crc kubenswrapper[4775]: I0127 12:08:26.495011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320"} Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.231511 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280672 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.281276 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities" (OuterVolumeSpecName: "utilities") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.287826 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6" (OuterVolumeSpecName: "kube-api-access-2ksc6") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "kube-api-access-2ksc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.320318 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382547 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382591 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382607 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506424 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506483 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"49113f27bda5f8b5de5626a113acf85f302b42d3df20c2e33e6424f2d1c022cf"} Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506651 4775 scope.go:117] "RemoveContainer" containerID="26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.525488 4775 scope.go:117] "RemoveContainer" containerID="0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.540910 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.549560 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.568726 4775 scope.go:117] "RemoveContainer" containerID="0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.755816 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" path="/var/lib/kubelet/pods/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32/volumes" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.586746 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.586818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.646273 4775 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:30 crc kubenswrapper[4775]: I0127 12:08:30.586762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:31 crc kubenswrapper[4775]: I0127 12:08:31.004661 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.562919 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7" exitCode=0 Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.562992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7"} Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.563593 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9vzf4" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" containerID="cri-o://5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" gracePeriod=2 Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.543987 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.591883 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596480 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" exitCode=0 Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596547 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596617 4775 scope.go:117] "RemoveContainer" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596642 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.618957 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities" (OuterVolumeSpecName: "utilities") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.625607 4775 scope.go:117] "RemoveContainer" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.638052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj" (OuterVolumeSpecName: "kube-api-access-9zdlj") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "kube-api-access-9zdlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.640818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.654893 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jw487" podStartSLOduration=3.111946289 podStartE2EDuration="13.654863921s" podCreationTimestamp="2026-01-27 12:08:20 +0000 UTC" firstStartedPulling="2026-01-27 12:08:22.447104369 +0000 UTC m=+2881.588702146" lastFinishedPulling="2026-01-27 12:08:32.990022001 +0000 UTC m=+2892.131619778" observedRunningTime="2026-01-27 12:08:33.627925132 +0000 UTC m=+2892.769522909" watchObservedRunningTime="2026-01-27 12:08:33.654863921 +0000 UTC m=+2892.796461698" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.700690 4775 scope.go:117] "RemoveContainer" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719007 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719050 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719071 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.751253 4775 scope.go:117] "RemoveContainer" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.752157 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": container with ID starting with 5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e not found: ID does not exist" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752191 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} err="failed to get container status \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": rpc error: code = NotFound desc = could not find container \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": container with ID starting with 5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752209 4775 scope.go:117] "RemoveContainer" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.752563 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": container with ID starting with 556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4 not found: ID does not exist" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752587 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} err="failed to get container status \"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": rpc error: code = NotFound desc = could not find container 
\"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": container with ID starting with 556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4 not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752600 4775 scope.go:117] "RemoveContainer" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.753003 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": container with ID starting with ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776 not found: ID does not exist" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.753025 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776"} err="failed to get container status \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": rpc error: code = NotFound desc = could not find container \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": container with ID starting with ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776 not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.930917 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.941932 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:35 crc kubenswrapper[4775]: I0127 12:08:35.755208 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" 
path="/var/lib/kubelet/pods/5be7b52b-2651-4ee2-ab40-fef637a295e9/volumes" Jan 27 12:08:41 crc kubenswrapper[4775]: I0127 12:08:41.390355 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:41 crc kubenswrapper[4775]: I0127 12:08:41.390994 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:42 crc kubenswrapper[4775]: I0127 12:08:42.439185 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jw487" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" probeResult="failure" output=< Jan 27 12:08:42 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 12:08:42 crc kubenswrapper[4775]: > Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.156968 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158159 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158176 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158199 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158207 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158218 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158228 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158245 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158253 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158266 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158275 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158310 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158317 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158576 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158593 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.159870 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.162883 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7p92f"/"default-dockercfg-76fkb" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.163095 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p92f"/"kube-root-ca.crt" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.163130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p92f"/"openshift-service-ca.crt" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.169815 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.313912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.314021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.416266 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " 
pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.416370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.417170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.443246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.485130 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.979715 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: W0127 12:08:48.987441 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09caf0cd_6a8c_41d8_84a7_7813e19a373a.slice/crio-70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b WatchSource:0}: Error finding container 70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b: Status 404 returned error can't find the container with id 70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b Jan 27 12:08:49 crc kubenswrapper[4775]: I0127 12:08:49.764867 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b"} Jan 27 12:08:51 crc kubenswrapper[4775]: I0127 12:08:51.461788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:51 crc kubenswrapper[4775]: I0127 12:08:51.519951 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:52 crc kubenswrapper[4775]: I0127 12:08:52.224268 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:52 crc kubenswrapper[4775]: I0127 12:08:52.811772 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jw487" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" 
containerID="cri-o://c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab" gracePeriod=2 Jan 27 12:08:53 crc kubenswrapper[4775]: I0127 12:08:53.823809 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab" exitCode=0 Jan 27 12:08:53 crc kubenswrapper[4775]: I0127 12:08:53.824097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab"} Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.326440 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452194 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 
12:08:56.453778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities" (OuterVolumeSpecName: "utilities") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.461993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967" (OuterVolumeSpecName: "kube-api-access-7s967") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "kube-api-access-7s967". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.555780 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") on node \"crc\" DevicePath \"\""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.555846 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.593154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.657754 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.854050 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"}
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.854138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"}
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9"}
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857784 4775 scope.go:117] "RemoveContainer" containerID="c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab"
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857815 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487"
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.879409 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p92f/must-gather-wqwn4" podStartSLOduration=1.8302647109999999 podStartE2EDuration="8.879388406s" podCreationTimestamp="2026-01-27 12:08:48 +0000 UTC" firstStartedPulling="2026-01-27 12:08:48.990277434 +0000 UTC m=+2908.131875211" lastFinishedPulling="2026-01-27 12:08:56.039401129 +0000 UTC m=+2915.180998906" observedRunningTime="2026-01-27 12:08:56.873189682 +0000 UTC m=+2916.014787459" watchObservedRunningTime="2026-01-27 12:08:56.879388406 +0000 UTC m=+2916.020986183"
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.896491 4775 scope.go:117] "RemoveContainer" containerID="8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7"
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.909497 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"]
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.921945 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"]
Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.938503 4775 scope.go:117] "RemoveContainer" containerID="be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9"
Jan 27 12:08:57 crc kubenswrapper[4775]: I0127 12:08:57.755416 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435eff0a-268d-44de-921d-217e8067a11d" path="/var/lib/kubelet/pods/435eff0a-268d-44de-921d-217e8067a11d/volumes"
Jan 27 12:08:59 crc kubenswrapper[4775]: I0127 12:08:59.518413 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:08:59 crc kubenswrapper[4775]: I0127 12:08:59.518825 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.475393 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"]
Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476126 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-utilities"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476146 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-utilities"
Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476177 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-content"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476184 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-content"
Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476202 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476210 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476429 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.477157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.640217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.640854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.742838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.742913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.743118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.769380 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.798109 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:09:00 crc kubenswrapper[4775]: W0127 12:09:00.830604 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c262d80_3666_411e_9947_d5ab93033fa7.slice/crio-14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10 WatchSource:0}: Error finding container 14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10: Status 404 returned error can't find the container with id 14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10
Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.901550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerStarted","Data":"14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10"}
Jan 27 12:09:02 crc kubenswrapper[4775]: E0127 12:09:02.328610 4775 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.22:35796->38.102.83.22:36975: read tcp 38.102.83.22:35796->38.102.83.22:36975: read: connection reset by peer
Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.653989 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296"
Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.654898 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrssq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-fnhcr_openshift-must-gather-7p92f(9c262d80-3666-411e-9947-d5ab93033fa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.656162 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podUID="9c262d80-3666-411e-9947-d5ab93033fa7"
Jan 27 12:09:17 crc kubenswrapper[4775]: E0127 12:09:17.084518 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podUID="9c262d80-3666-411e-9947-d5ab93033fa7"
Jan 27 12:09:29 crc kubenswrapper[4775]: I0127 12:09:29.518001 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:09:29 crc kubenswrapper[4775]: I0127 12:09:29.518696 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:09:37 crc kubenswrapper[4775]: I0127 12:09:37.287490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerStarted","Data":"baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05"}
Jan 27 12:09:37 crc kubenswrapper[4775]: I0127 12:09:37.303258 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podStartSLOduration=1.6889406710000001 podStartE2EDuration="37.303236822s" podCreationTimestamp="2026-01-27 12:09:00 +0000 UTC" firstStartedPulling="2026-01-27 12:09:00.833273022 +0000 UTC m=+2919.974870799" lastFinishedPulling="2026-01-27 12:09:36.447569173 +0000 UTC m=+2955.589166950" observedRunningTime="2026-01-27 12:09:37.300341891 +0000 UTC m=+2956.441939668" watchObservedRunningTime="2026-01-27 12:09:37.303236822 +0000 UTC m=+2956.444834589"
Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518107 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518753 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518817 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x"
Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.519852 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.519916 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" gracePeriod=600
Jan 27 12:09:59 crc kubenswrapper[4775]: E0127 12:09:59.650753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527517 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" exitCode=0
Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"}
Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527864 4775 scope.go:117] "RemoveContainer" containerID="e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"
Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.528566 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:10:00 crc kubenswrapper[4775]: E0127 12:10:00.528811 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:10:06 crc kubenswrapper[4775]: I0127 12:10:06.583986 4775 generic.go:334] "Generic (PLEG): container finished" podID="9c262d80-3666-411e-9947-d5ab93033fa7" containerID="baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05" exitCode=0
Jan 27 12:10:06 crc kubenswrapper[4775]: I0127 12:10:06.584077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerDied","Data":"baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05"}
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.708678 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.760359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"]
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.760408 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"]
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.869991 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"9c262d80-3666-411e-9947-d5ab93033fa7\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") "
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"9c262d80-3666-411e-9947-d5ab93033fa7\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") "
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host" (OuterVolumeSpecName: "host") pod "9c262d80-3666-411e-9947-d5ab93033fa7" (UID: "9c262d80-3666-411e-9947-d5ab93033fa7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870830 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") on node \"crc\" DevicePath \"\""
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.876483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq" (OuterVolumeSpecName: "kube-api-access-wrssq") pod "9c262d80-3666-411e-9947-d5ab93033fa7" (UID: "9c262d80-3666-411e-9947-d5ab93033fa7"). InnerVolumeSpecName "kube-api-access-wrssq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.972619 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") on node \"crc\" DevicePath \"\""
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.603922 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.603967 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902116 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"]
Jan 27 12:10:08 crc kubenswrapper[4775]: E0127 12:10:08.902502 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902514 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902689 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.903291 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.991194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.991276 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093540 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.114421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.224283 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613230 4775 generic.go:334] "Generic (PLEG): container finished" podID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerID="5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1" exitCode=1
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" event={"ID":"68e8da4d-550a-40eb-b851-4e7f2b637352","Type":"ContainerDied","Data":"5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1"}
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" event={"ID":"68e8da4d-550a-40eb-b851-4e7f2b637352","Type":"ContainerStarted","Data":"39e44386371866f95623ce009248c5babf2f1cd2da65943d9bb0221095e0c22c"}
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.653667 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"]
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.663731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"]
Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.756028 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" path="/var/lib/kubelet/pods/9c262d80-3666-411e-9947-d5ab93033fa7/volumes"
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.727922 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"68e8da4d-550a-40eb-b851-4e7f2b637352\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") "
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"68e8da4d-550a-40eb-b851-4e7f2b637352\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") "
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host" (OuterVolumeSpecName: "host") pod "68e8da4d-550a-40eb-b851-4e7f2b637352" (UID: "68e8da4d-550a-40eb-b851-4e7f2b637352"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.831178 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") on node \"crc\" DevicePath \"\""
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.836676 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd" (OuterVolumeSpecName: "kube-api-access-gfrrd") pod "68e8da4d-550a-40eb-b851-4e7f2b637352" (UID: "68e8da4d-550a-40eb-b851-4e7f2b637352"). InnerVolumeSpecName "kube-api-access-gfrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.933035 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") on node \"crc\" DevicePath \"\""
Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.630948 4775 scope.go:117] "RemoveContainer" containerID="5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1"
Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.630995 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd"
Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.755814 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" path="/var/lib/kubelet/pods/68e8da4d-550a-40eb-b851-4e7f2b637352/volumes"
Jan 27 12:10:12 crc kubenswrapper[4775]: I0127 12:10:12.744702 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:10:12 crc kubenswrapper[4775]: E0127 12:10:12.745096 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:10:25 crc kubenswrapper[4775]: I0127 12:10:25.746049 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:10:25 crc kubenswrapper[4775]: E0127 12:10:25.748871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:10:26 crc kubenswrapper[4775]: I0127 12:10:26.875006 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d66b74d76-ngwn9_8fa6c814-723c-4638-8ae9-dbb9f6864120/barbican-api/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.064470 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d66b74d76-ngwn9_8fa6c814-723c-4638-8ae9-dbb9f6864120/barbican-api-log/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.158620 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78f66698d-fbfmx_1138f75c-8e56-4a32-8110-8b26d9f80688/barbican-keystone-listener-log/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.183482 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78f66698d-fbfmx_1138f75c-8e56-4a32-8110-8b26d9f80688/barbican-keystone-listener/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.354561 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bd6cd4f4f-kxhrc_8874fbc9-9d42-45dd-b38b-9ba1a33340f5/barbican-worker-log/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.383178 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bd6cd4f4f-kxhrc_8874fbc9-9d42-45dd-b38b-9ba1a33340f5/barbican-worker/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.558771 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw_ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.637782 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-central-agent/1.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.683604 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-central-agent/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.760264 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-notification-agent/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.798266 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-notification-agent/1.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.859220 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/proxy-httpd/0.log"
Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.889496 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/sg-core/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.045497 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d670312-cbe8-44de-8f6f-857772d2af05/cinder-api/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.062201 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d670312-cbe8-44de-8f6f-857772d2af05/cinder-api-log/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.262746 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_030ef7f1-5f79-42e9-800e-55c4f70964e5/cinder-scheduler/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.284597 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_030ef7f1-5f79-42e9-800e-55c4f70964e5/probe/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.445989 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-spnbk_d688b7ee-365a-441b-a0ab-3d1cf6663988/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.545967 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb_a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.660871 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/init/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.807392 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/init/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.829476 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/dnsmasq-dns/0.log"
Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.879815 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wskgh_e018489b-9445-4afb-8e4c-e9d52a6781d7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.070473 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_899a9893-167d-4c9c-9495-3c663c7d0855/glance-httpd/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.092418 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_899a9893-167d-4c9c-9495-3c663c7d0855/glance-log/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.259123 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d8a9ef1-1171-438f-be81-89f670bd9735/glance-httpd/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.282683 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d8a9ef1-1171-438f-be81-89f670bd9735/glance-log/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.516287 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6546ffcc78-4zdnk_00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4/horizon/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.649960 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-czgtf_d002bd2d-2dcd-4ba3-841b-1306c023469b/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.775867 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6546ffcc78-4zdnk_00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4/horizon-log/0.log"
Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.858212 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-87v8z_2a28c09e-4891-433d-a745-f3dcfc8654aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.055624 4775 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_keystone-cron-29491921-2bnsm_5ce874bb-50b0-4a56-a322-f5590c1d19bd/keystone-cron/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.127289 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5994598694-dhq5v_94f53f42-a5fc-45f9-b94c-4f12b63d8d75/keystone-api/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.262888 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7aa68248-0707-4f5c-8689-57cf6d07c250/kube-state-metrics/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.360331 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm_7ab3ce35-77fe-4e38-ad60-c5906f6d061a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.640350 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c59c678b7-lbtkp_857ed116-b219-4af4-9c38-69e85db0c484/neutron-api/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.760267 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c59c678b7-lbtkp_857ed116-b219-4af4-9c38-69e85db0c484/neutron-httpd/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.869838 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97_352eaecd-6d51-4198-b3e6-ce59a6485be1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.418315 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_451ba9e3-91a7-4fd5-9e95-b827186dee9d/nova-api-log/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.485992 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_21548904-8b74-4b9b-81fb-df04e62dc7df/nova-cell0-conductor-conductor/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.604351 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_451ba9e3-91a7-4fd5-9e95-b827186dee9d/nova-api-api/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.714743 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c8d213b2-8a0b-479c-8c94-148f1afe1db0/nova-cell1-conductor-conductor/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.895413 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_80ce7ac7-056a-44ec-be77-f87a96dc23f5/nova-cell1-novncproxy-novncproxy/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.955663 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-27lb2_36bee79d-4a97-407b-9907-87d740929ba0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.254395 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b95ff32a-7b7f-43d8-b521-6d07c8d78c99/nova-metadata-log/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.402122 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a4732753-3f10-4604-89d0-0c074829e53f/nova-scheduler-scheduler/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.479153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/mysql-bootstrap/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.771118 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/galera/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 
12:10:32.804126 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/mysql-bootstrap/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.991311 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/mysql-bootstrap/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.118658 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/mysql-bootstrap/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.125137 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b95ff32a-7b7f-43d8-b521-6d07c8d78c99/nova-metadata-metadata/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.199267 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/galera/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.334149 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_db40a4a8-ce91-40a6-8b63-ccc17ed327da/openstackclient/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.422895 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4hqln_cacc7142-a8d4-4607-adb7-0090fbd3024a/ovn-controller/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.580918 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9xncr_7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5/openstack-network-exporter/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.703966 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server-init/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.858710 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovs-vswitchd/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.937873 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server-init/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.946838 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.112298 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2p96g_41359e3c-21d7-4c22-bcef-0968c2f8cca5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.205953 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6bb656eb-1eea-436d-acf3-6d8a548a97e5/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.278654 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6bb656eb-1eea-436d-acf3-6d8a548a97e5/ovn-northd/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.434879 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09719e3d-fd6c-4c22-8c15-8ef911bc6598/ovsdbserver-nb/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.523381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09719e3d-fd6c-4c22-8c15-8ef911bc6598/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.656353 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fb252ada-9191-4d2d-8ab9-d12f4668a35a/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 
12:10:34.665312 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fb252ada-9191-4d2d-8ab9-d12f4668a35a/ovsdbserver-sb/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.885578 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f49dbf586-l2cmp_2b3edac4-ba7b-4c93-b66f-43ab468d290f/placement-api/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.999562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f49dbf586-l2cmp_2b3edac4-ba7b-4c93-b66f-43ab468d290f/placement-log/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.039236 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.324908 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.496886 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.526500 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/rabbitmq/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.684085 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.707804 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/rabbitmq/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.817792 4775 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw_ca771db8-558f-4e69-ba8c-37ed97f534b4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:35.999911 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-thhgd_e2226633-918b-423c-a329-bfd52943a1b0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.152607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm_ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.258704 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fvf2b_f349798f-861c-4071-b418-61fe20227133/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.383658 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r78nv_28d386bc-d48d-41e0-9ae2-bbe8f876ba10/ssh-known-hosts-edpm-deployment/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.644180 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66648b46df-hskmp_e22ddb6f-e33b-41ea-a24f-c97c0676e6e5/proxy-httpd/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.650153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66648b46df-hskmp_e22ddb6f-e33b-41ea-a24f-c97c0676e6e5/proxy-server/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.713638 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7bdl6_aa44a018-6958-4bee-895d-e7ec3966be8d/swift-ring-rebalance/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.910922 
4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-auditor/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.959657 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-reaper/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.013710 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.174084 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-auditor/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.181536 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.220488 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.222199 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.396600 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-updater/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.399730 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-auditor/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.422541 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-expirer/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.524460 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.608232 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-updater/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.646776 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/rsync/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.698785 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.721538 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/swift-recon-cron/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.744799 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:37 crc kubenswrapper[4775]: E0127 12:10:37.745093 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.013926 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4/tempest-tests-tempest-tests-runner/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.032309 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-trmfd_c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.276636 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9d17d9d1-39f7-417c-b058-cda582c7f7d3/test-operator-logs-container/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.303949 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8_6b092f27-cfd0-4c25-beab-c347f14371a1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:43 crc kubenswrapper[4775]: I0127 12:10:43.660083 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_07cc1808-c408-433d-aefa-f603408de606/memcached/0.log" Jan 27 12:10:51 crc kubenswrapper[4775]: I0127 12:10:51.753133 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:51 crc kubenswrapper[4775]: E0127 12:10:51.753869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.305478 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.514384 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.527092 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.564197 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.677020 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.702014 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.712635 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/extract/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.950533 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5fdc687f5-9wc4j_f04fa2a0-7af2-439a-9169-6edf5be65b35/manager/0.log" Jan 27 12:11:03 crc 
kubenswrapper[4775]: I0127 12:11:03.164675 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-76d4d5b8f9-dvj9s_c31d5b06-1ad2-4914-96c1-e0f0b8c4974e/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.430260 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84d5bb46b-cvp5b_0cabb338-c4a1-41b4-abd6-d535b0e88406/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.432259 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-658dd65b86-jp5c7_dd9264fb-034f-46d3-8698-dcc6fc3470f6/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.726929 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7f5ddd8d7b-58qnd_703a739a-6687-4324-b937-7d0efe7c143b/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.746297 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:03 crc kubenswrapper[4775]: E0127 12:11:03.746575 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.004933 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-58865f87b4-s2l5z_b296a3cd-1dc1-4511-af7a-7b1801e23e61/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.300461 4775 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54ccf4f85d-d7vhk_0da235e3-e76a-408f-8e0e-3cdd7ce76705/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.352472 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-78f8b7b89c-2wqgg_4e719fbd-ac18-4ae1-bac6-c42f1e081daa/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.505008 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78b8f8fd84-8xrd7_6c5084e4-b0e1-46fd-ae69-c0f2ede3db17/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.769856 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b88bfc995-tzn2s_56fb2890-7d29-452c-9f24-4aa20d977f0b/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.936930 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-569695f6c5-pmk9t_6bcdd59a-9739-40e7-9625-3e56009dcbd7/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.189351 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74ffd97575-cln8g_2a55fa83-c395-4ac2-bc2e-355ad48a4a95/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.429567 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8_3e47cb1c-7f01-4b8d-904f-fed543678a02/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.888292 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6bfcf7b875-z4vw8_8868fb89-f25b-48ef-b4e2-9acab9f78790/operator/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.539210 
4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-swjcb_56b44f0b-813c-4626-a8ec-54ac78bbb086/registry-server/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.776602 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-p9vts_701902fe-7e51-44b6-923b-0a60c96d6707/manager/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.777947 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7bf4858b78-fcd9x_7df5397d-0c1f-46b4-8695-d80c752ca569/manager/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.983093 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7748d79f84-vmtx4_e14198f0-3413-4350-bae5-33b23ceead05/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.058735 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-g5nsq_a5e8d398-7976-4603-8409-304fa193f7f1/operator/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.335952 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-65596dbf77-9sfp8_909c9a87-2eb1-4a52-b86d-6d36524b1eb2/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.537034 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7db57dc8bf-5lbbt_01a03f23-ead5-4a15-976f-4dda2622083b/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.697967 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6c866cfdcb-2mz97_5070c545-d4c0-46b3-afb9-c130dc982406/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.835330 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6476466c7c-lb4h8_bea84175-0947-45e5-a635-b7d32a0442c6/manager/0.log" Jan 27 12:11:08 crc kubenswrapper[4775]: I0127 12:11:08.183740 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76958f4d87-8js8k_2ecfe007-a4bf-4c31-bc83-36f4c5f00815/manager/0.log" Jan 27 12:11:10 crc kubenswrapper[4775]: I0127 12:11:10.397842 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75b8f798ff-t29z2_04cbcc0c-4375-44f0-9461-b43492e9d95b/manager/0.log" Jan 27 12:11:16 crc kubenswrapper[4775]: I0127 12:11:16.745160 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:16 crc kubenswrapper[4775]: E0127 12:11:16.745910 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:28 crc kubenswrapper[4775]: I0127 12:11:28.744726 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:28 crc kubenswrapper[4775]: E0127 12:11:28.745858 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" 
podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:11:29 crc kubenswrapper[4775]: I0127 12:11:29.444041 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gl7ql_87a94d4a-7341-4e6c-8194-a2e6832dbb01/control-plane-machine-set-operator/0.log"
Jan 27 12:11:29 crc kubenswrapper[4775]: I0127 12:11:29.653872 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-sknjj_f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd/machine-api-operator/0.log"
Jan 27 12:11:29 crc kubenswrapper[4775]: I0127 12:11:29.709248 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-sknjj_f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd/kube-rbac-proxy/0.log"
Jan 27 12:11:41 crc kubenswrapper[4775]: I0127 12:11:41.761967 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:11:41 crc kubenswrapper[4775]: E0127 12:11:41.762732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.023137 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-xpr9c_6b64e5cd-1b80-489b-8d69-3ebf7862eb9f/cert-manager-controller/0.log"
Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.229985 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-4sq7k_ea378b66-945f-4832-b293-59576474b63c/cert-manager-cainjector/0.log"
Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.300639 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-5w45m_882dbf86-77c4-46a5-a75b-b7b4a70d3ac1/cert-manager-webhook/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.077356 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-tm9vw_76d9c92d-c012-448b-8ff5-00f10c17c5a7/nmstate-console-plugin/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.291167 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4vtwf_0aa6cbcb-077f-4ae7-85b2-d79679ef64df/nmstate-handler/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.344712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2qhwx_4c84a5ec-b41d-4396-adea-3c9964cc7c59/kube-rbac-proxy/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.406067 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2qhwx_4c84a5ec-b41d-4396-adea-3c9964cc7c59/nmstate-metrics/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.537976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-znzng_cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f/nmstate-operator/0.log"
Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.628976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-d9lzh_d9f9feec-ee04-44de-8879-4071243ac6db/nmstate-webhook/0.log"
Jan 27 12:11:55 crc kubenswrapper[4775]: I0127 12:11:55.745340 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:11:55 crc kubenswrapper[4775]: E0127 12:11:55.745743 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:12:06 crc kubenswrapper[4775]: I0127 12:12:06.744981 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:12:06 crc kubenswrapper[4775]: E0127 12:12:06.745920 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:12:19 crc kubenswrapper[4775]: I0127 12:12:19.745525 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:12:19 crc kubenswrapper[4775]: E0127 12:12:19.746253 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.012828 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4tjsf_6bd75754-cf96-4b57-bfd3-711aa3dc06e6/kube-rbac-proxy/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.149664 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4tjsf_6bd75754-cf96-4b57-bfd3-711aa3dc06e6/controller/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.223716 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.461727 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.479694 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.491172 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.494672 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.728079 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.739790 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.781840 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.833178 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log"
Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.985898 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.010211 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.019223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.055434 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/controller/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.217133 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/frr-metrics/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.256759 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/kube-rbac-proxy/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.307319 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/kube-rbac-proxy-frr/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.469751 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/reloader/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.631847 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-ht6jz_de8a1d9c-9c8b-4200-92ae-b82c65b24d56/frr-k8s-webhook-server/0.log"
Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.844223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8c7fc46c-g7l74_7560029a-575e-4d87-b4e8-4f090c5a7cd9/manager/0.log"
Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.020972 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b85bfbbbb-bb966_acb19b04-4cd3-4304-a572-d25d4aa2932b/webhook-server/0.log"
Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.190096 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qm9dq_5573a041-6f7e-4c23-b2ea-42de01c96cdd/kube-rbac-proxy/0.log"
Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.966702 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qm9dq_5573a041-6f7e-4c23-b2ea-42de01c96cdd/speaker/0.log"
Jan 27 12:12:24 crc kubenswrapper[4775]: I0127 12:12:24.034750 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/frr/0.log"
Jan 27 12:12:34 crc kubenswrapper[4775]: I0127 12:12:34.745679 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:12:34 crc kubenswrapper[4775]: E0127 12:12:34.746520 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.090406 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.336862 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.337010 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.351627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.518769 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.526851 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/extract/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.548945 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.705310 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.892716 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.904573 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log"
Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.947944 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.125639 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/extract/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.151152 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.172569 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.304485 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.492604 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.509061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.524731 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.676976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.741914 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log"
Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.901878 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.069307 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/registry-server/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.172621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.177168 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.211640 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.381960 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.414087 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.630959 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.712801 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/2.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.844771 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/registry-server/0.log"
Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.871369 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.025988 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.061738 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.061794 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.240276 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.244055 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.358232 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/registry-server/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.462549 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.635072 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.674788 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.696109 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.817806 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log"
Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.855562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log"
Jan 27 12:12:40 crc kubenswrapper[4775]: I0127 12:12:40.368916 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/registry-server/0.log"
Jan 27 12:12:48 crc kubenswrapper[4775]: I0127 12:12:48.745244 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:12:48 crc kubenswrapper[4775]: E0127 12:12:48.747173 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:00 crc kubenswrapper[4775]: I0127 12:13:00.744960 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:00 crc kubenswrapper[4775]: E0127 12:13:00.745753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:11 crc kubenswrapper[4775]: I0127 12:13:11.753783 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:11 crc kubenswrapper[4775]: E0127 12:13:11.754737 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:26 crc kubenswrapper[4775]: I0127 12:13:26.745267 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:26 crc kubenswrapper[4775]: E0127 12:13:26.746181 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:41 crc kubenswrapper[4775]: I0127 12:13:41.753182 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:41 crc kubenswrapper[4775]: E0127 12:13:41.754091 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:54 crc kubenswrapper[4775]: I0127 12:13:54.751439 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:54 crc kubenswrapper[4775]: E0127 12:13:54.753946 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:09 crc kubenswrapper[4775]: I0127 12:14:09.747964 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:09 crc kubenswrapper[4775]: E0127 12:14:09.749084 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:21 crc kubenswrapper[4775]: I0127 12:14:21.754055 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:21 crc kubenswrapper[4775]: E0127 12:14:21.755026 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.007475 4775 generic.go:334] "Generic (PLEG): container finished" podID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b" exitCode=0
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.007580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerDied","Data":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"}
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.008761 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.502007 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/gather/0.log"
Jan 27 12:14:34 crc kubenswrapper[4775]: I0127 12:14:34.745858 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:34 crc kubenswrapper[4775]: E0127 12:14:34.746713 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.377511 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"]
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.378355 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7p92f/must-gather-wqwn4" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy" containerID="cri-o://d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3" gracePeriod=2
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.403840 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"]
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.905927 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/copy/0.log"
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.906576 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.042639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") "
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.042813 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") "
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.061356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6" (OuterVolumeSpecName: "kube-api-access-84zh6") pod "09caf0cd-6a8c-41d8-84a7-7813e19a373a" (UID: "09caf0cd-6a8c-41d8-84a7-7813e19a373a"). InnerVolumeSpecName "kube-api-access-84zh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102237 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/copy/0.log"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102578 4775 generic.go:334] "Generic (PLEG): container finished" podID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3" exitCode=143
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102631 4775 scope.go:117] "RemoveContainer" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102775 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.132147 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.145246 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") on node \"crc\" DevicePath \"\""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.226170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09caf0cd-6a8c-41d8-84a7-7813e19a373a" (UID: "09caf0cd-6a8c-41d8-84a7-7813e19a373a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.246756 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.256117 4775 scope.go:117] "RemoveContainer" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: E0127 12:14:40.257909 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": container with ID starting with d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3 not found: ID does not exist" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.257970 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"} err="failed to get container status \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": rpc error: code = NotFound desc = could not find container \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": container with ID starting with d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3 not found: ID does not exist"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.258000 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: E0127 12:14:40.259100 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": container with ID starting with 14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b not found: ID does not exist" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.259158 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"} err="failed to get container status \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": rpc error: code = NotFound desc = could not find container \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": container with ID starting with 14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b not found: ID does not exist"
Jan 27 12:14:41 crc kubenswrapper[4775]: I0127 12:14:41.760479 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" path="/var/lib/kubelet/pods/09caf0cd-6a8c-41d8-84a7-7813e19a373a/volumes"
Jan 27 12:14:49 crc kubenswrapper[4775]: I0127 12:14:49.745582 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:49 crc kubenswrapper[4775]: E0127 12:14:49.747555 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.223574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224789 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224814 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224840 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224894 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225147 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225164 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225187 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.227109 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.242710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"] Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480755 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.481521 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.481723 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.510948 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.563031 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.944785 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"] Jan 27 12:14:52 crc kubenswrapper[4775]: I0127 12:14:52.216588 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"} Jan 27 12:14:52 crc kubenswrapper[4775]: I0127 12:14:52.216637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"e10b0c1057c8fe3dc02028a5148e954d203d6bc10424ddb7a56cc51b4b561ace"} Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.225968 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209" exitCode=0 Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.226109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"} Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.229105 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 12:14:54 crc kubenswrapper[4775]: I0127 12:14:54.236858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"} Jan 27 12:14:58 crc 
kubenswrapper[4775]: I0127 12:14:58.268944 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30" exitCode=0 Jan 27 12:14:58 crc kubenswrapper[4775]: I0127 12:14:58.269043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"} Jan 27 12:14:59 crc kubenswrapper[4775]: I0127 12:14:59.280900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"} Jan 27 12:14:59 crc kubenswrapper[4775]: I0127 12:14:59.311914 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n5z9l" podStartSLOduration=2.581519847 podStartE2EDuration="8.311886425s" podCreationTimestamp="2026-01-27 12:14:51 +0000 UTC" firstStartedPulling="2026-01-27 12:14:53.228908937 +0000 UTC m=+3272.370506704" lastFinishedPulling="2026-01-27 12:14:58.959275505 +0000 UTC m=+3278.100873282" observedRunningTime="2026-01-27 12:14:59.303823117 +0000 UTC m=+3278.445420914" watchObservedRunningTime="2026-01-27 12:14:59.311886425 +0000 UTC m=+3278.453484202" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.156943 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"] Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.158969 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.161562 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.162216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.190099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"] Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.366351 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.371710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.386983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.482233 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.957157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"] Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.304006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerStarted","Data":"c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8"} Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.304576 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerStarted","Data":"1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319"} Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.329641 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" 
podStartSLOduration=1.32962109 podStartE2EDuration="1.32962109s" podCreationTimestamp="2026-01-27 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 12:15:01.328716705 +0000 UTC m=+3280.470314472" watchObservedRunningTime="2026-01-27 12:15:01.32962109 +0000 UTC m=+3280.471218867" Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.563797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.563856 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.608756 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:02 crc kubenswrapper[4775]: I0127 12:15:02.314752 4775 generic.go:334] "Generic (PLEG): container finished" podID="c654a0c1-9099-4854-bf80-86bf948aac80" containerID="c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8" exitCode=0 Jan 27 12:15:02 crc kubenswrapper[4775]: I0127 12:15:02.314823 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerDied","Data":"c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8"} Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.666359 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729435 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.730289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume" (OuterVolumeSpecName: "config-volume") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.735732 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.735839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7" (OuterVolumeSpecName: "kube-api-access-qqpd7") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "kube-api-access-qqpd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833276 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833512 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833573 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:04 crc kubenswrapper[4775]: E0127 12:15:04.003772 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc654a0c1_9099_4854_bf80_86bf948aac80.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc654a0c1_9099_4854_bf80_86bf948aac80.slice/crio-1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319\": RecentStats: unable to find data in memory cache]" Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333322 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerDied","Data":"1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319"} Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333357 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319" Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333690 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.400385 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.409715 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.744951 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:15:05 crc kubenswrapper[4775]: I0127 12:15:05.756201 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" path="/var/lib/kubelet/pods/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9/volumes" Jan 27 12:15:06 crc kubenswrapper[4775]: I0127 12:15:06.353638 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93"} Jan 27 12:15:11 crc kubenswrapper[4775]: I0127 12:15:11.610248 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:11 crc kubenswrapper[4775]: I0127 12:15:11.660936 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"] Jan 27 12:15:12 crc kubenswrapper[4775]: I0127 12:15:12.406246 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n5z9l" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="registry-server" containerID="cri-o://b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" gracePeriod=2 Jan 27 12:15:12 crc kubenswrapper[4775]: I0127 12:15:12.949604 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.119585 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120191 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120324 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities" (OuterVolumeSpecName: "utilities") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.121061 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.126924 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs" (OuterVolumeSpecName: "kube-api-access-78vzs") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "kube-api-access-78vzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.173295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.223290 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.223341 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.416950 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" exitCode=0 Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.416994 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"} Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"e10b0c1057c8fe3dc02028a5148e954d203d6bc10424ddb7a56cc51b4b561ace"} Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417038 4775 scope.go:117] "RemoveContainer" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417148 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.447840 4775 scope.go:117] "RemoveContainer" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.454397 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"] Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.466511 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"] Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.471303 4775 scope.go:117] "RemoveContainer" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.514556 4775 scope.go:117] "RemoveContainer" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.514928 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": container with ID starting with b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03 not found: ID does not exist" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.514983 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"} err="failed to get container status \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": rpc error: code = NotFound desc = could not find container \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": container with ID starting with b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03 not 
found: ID does not exist" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515017 4775 scope.go:117] "RemoveContainer" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30" Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.515377 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": container with ID starting with 8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30 not found: ID does not exist" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515429 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"} err="failed to get container status \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": rpc error: code = NotFound desc = could not find container \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": container with ID starting with 8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30 not found: ID does not exist" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515481 4775 scope.go:117] "RemoveContainer" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209" Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.515817 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": container with ID starting with 8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209 not found: ID does not exist" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515840 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"} err="failed to get container status \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": rpc error: code = NotFound desc = could not find container \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": container with ID starting with 8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209 not found: ID does not exist" Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.774559 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" path="/var/lib/kubelet/pods/cbff1de5-dd70-4733-8a6f-8538d9940aee/volumes" Jan 27 12:15:42 crc kubenswrapper[4775]: I0127 12:15:42.605724 4775 scope.go:117] "RemoveContainer" containerID="baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05" Jan 27 12:15:42 crc kubenswrapper[4775]: I0127 12:15:42.631672 4775 scope.go:117] "RemoveContainer" containerID="0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1" Jan 27 12:17:29 crc kubenswrapper[4775]: I0127 12:17:29.517277 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:17:29 crc kubenswrapper[4775]: I0127 12:17:29.517958 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:17:59 crc kubenswrapper[4775]: I0127 12:17:59.517882 4775 patch_prober.go:28] interesting 
pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:17:59 crc kubenswrapper[4775]: I0127 12:17:59.518637 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:18:29 crc kubenswrapper[4775]: I0127 12:18:29.517693 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:18:29 crc kubenswrapper[4775]: I0127 12:18:29.518269 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:18:29 crc kubenswrapper[4775]: I0127 12:18:29.518326 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 12:18:29 crc kubenswrapper[4775]: I0127 12:18:29.519218 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jan 27 12:18:29 crc kubenswrapper[4775]: I0127 12:18:29.519277 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93" gracePeriod=600 Jan 27 12:18:30 crc kubenswrapper[4775]: I0127 12:18:30.189435 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93" exitCode=0 Jan 27 12:18:30 crc kubenswrapper[4775]: I0127 12:18:30.189650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93"} Jan 27 12:18:30 crc kubenswrapper[4775]: I0127 12:18:30.189760 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"b2e577152d15d42a5efc34f6f6360a6dcf7c4bbfdf60c7e85ea647d72137b971"} Jan 27 12:18:30 crc kubenswrapper[4775]: I0127 12:18:30.189791 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.531001 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fv6vp"] Jan 27 12:18:43 crc kubenswrapper[4775]: E0127 12:18:43.533266 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="extract-utilities" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.533357 4775 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="extract-utilities" Jan 27 12:18:43 crc kubenswrapper[4775]: E0127 12:18:43.533421 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="extract-content" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.533509 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="extract-content" Jan 27 12:18:43 crc kubenswrapper[4775]: E0127 12:18:43.533587 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="registry-server" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.533640 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="registry-server" Jan 27 12:18:43 crc kubenswrapper[4775]: E0127 12:18:43.533697 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c654a0c1-9099-4854-bf80-86bf948aac80" containerName="collect-profiles" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.533754 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c654a0c1-9099-4854-bf80-86bf948aac80" containerName="collect-profiles" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.534197 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="registry-server" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.534292 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c654a0c1-9099-4854-bf80-86bf948aac80" containerName="collect-profiles" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.535777 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.547562 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fv6vp"] Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.689441 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-catalog-content\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.689643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-utilities\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.689711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npqc4\" (UniqueName: \"kubernetes.io/projected/75404da3-3f03-4fe8-ac34-21e38807514d-kube-api-access-npqc4\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.791316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-catalog-content\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.791423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-utilities\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.791505 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npqc4\" (UniqueName: \"kubernetes.io/projected/75404da3-3f03-4fe8-ac34-21e38807514d-kube-api-access-npqc4\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.792379 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-catalog-content\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.792643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75404da3-3f03-4fe8-ac34-21e38807514d-utilities\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.823218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npqc4\" (UniqueName: \"kubernetes.io/projected/75404da3-3f03-4fe8-ac34-21e38807514d-kube-api-access-npqc4\") pod \"redhat-operators-fv6vp\" (UID: \"75404da3-3f03-4fe8-ac34-21e38807514d\") " pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:43 crc kubenswrapper[4775]: I0127 12:18:43.869855 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:18:44 crc kubenswrapper[4775]: I0127 12:18:44.369126 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fv6vp"] Jan 27 12:18:45 crc kubenswrapper[4775]: I0127 12:18:45.349209 4775 generic.go:334] "Generic (PLEG): container finished" podID="75404da3-3f03-4fe8-ac34-21e38807514d" containerID="8483a57ff95d910dad213f7b4b4c9b118d58b00b66addb2d76b68ac84bb848bc" exitCode=0 Jan 27 12:18:45 crc kubenswrapper[4775]: I0127 12:18:45.349260 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fv6vp" event={"ID":"75404da3-3f03-4fe8-ac34-21e38807514d","Type":"ContainerDied","Data":"8483a57ff95d910dad213f7b4b4c9b118d58b00b66addb2d76b68ac84bb848bc"} Jan 27 12:18:45 crc kubenswrapper[4775]: I0127 12:18:45.349291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fv6vp" event={"ID":"75404da3-3f03-4fe8-ac34-21e38807514d","Type":"ContainerStarted","Data":"b76e906fb1a6b4055cd6e61ee91d0f37e79cc32a35d203abab7ba574cff38f5d"} Jan 27 12:18:47 crc kubenswrapper[4775]: I0127 12:18:47.372312 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fv6vp" event={"ID":"75404da3-3f03-4fe8-ac34-21e38807514d","Type":"ContainerStarted","Data":"c8e91740f64b115382e870b5a484863f5dd7c02847493171861102cd0f2ab30d"} Jan 27 12:18:59 crc kubenswrapper[4775]: I0127 12:18:59.609892 4775 generic.go:334] "Generic (PLEG): container finished" podID="75404da3-3f03-4fe8-ac34-21e38807514d" containerID="c8e91740f64b115382e870b5a484863f5dd7c02847493171861102cd0f2ab30d" exitCode=0 Jan 27 12:18:59 crc kubenswrapper[4775]: I0127 12:18:59.610027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fv6vp" 
event={"ID":"75404da3-3f03-4fe8-ac34-21e38807514d","Type":"ContainerDied","Data":"c8e91740f64b115382e870b5a484863f5dd7c02847493171861102cd0f2ab30d"} Jan 27 12:19:00 crc kubenswrapper[4775]: I0127 12:19:00.622307 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fv6vp" event={"ID":"75404da3-3f03-4fe8-ac34-21e38807514d","Type":"ContainerStarted","Data":"af3b5dcda6b84ff244c277bd97c9e3d24401234cd5f80073d7f4d9642d231c42"} Jan 27 12:19:00 crc kubenswrapper[4775]: I0127 12:19:00.645939 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fv6vp" podStartSLOduration=2.898307367 podStartE2EDuration="17.645918752s" podCreationTimestamp="2026-01-27 12:18:43 +0000 UTC" firstStartedPulling="2026-01-27 12:18:45.351339878 +0000 UTC m=+3504.492937675" lastFinishedPulling="2026-01-27 12:19:00.098951283 +0000 UTC m=+3519.240549060" observedRunningTime="2026-01-27 12:19:00.639889348 +0000 UTC m=+3519.781487145" watchObservedRunningTime="2026-01-27 12:19:00.645918752 +0000 UTC m=+3519.787516529" Jan 27 12:19:03 crc kubenswrapper[4775]: I0127 12:19:03.870236 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:19:03 crc kubenswrapper[4775]: I0127 12:19:03.871899 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fv6vp" Jan 27 12:19:04 crc kubenswrapper[4775]: I0127 12:19:04.925472 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fv6vp" podUID="75404da3-3f03-4fe8-ac34-21e38807514d" containerName="registry-server" probeResult="failure" output=< Jan 27 12:19:04 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 12:19:04 crc kubenswrapper[4775]: > 